Oct 01 13:37:06 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 13:37:06 crc restorecon[4678]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:06 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:37:07 crc restorecon[4678]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 
13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07
crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 
crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc 
restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:37:07 crc restorecon[4678]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 13:37:08 crc kubenswrapper[4774]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 13:37:08 crc kubenswrapper[4774]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 13:37:08 crc kubenswrapper[4774]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 13:37:08 crc kubenswrapper[4774]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 13:37:08 crc kubenswrapper[4774]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 13:37:08 crc kubenswrapper[4774]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.586347 4774 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589828 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589855 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589861 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589868 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589874 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589880 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589886 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589892 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589897 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589903 4774 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589909 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589914 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589919 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589925 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589930 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589936 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589941 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589946 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589951 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589956 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589961 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589967 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589972 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589976 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:37:08 
crc kubenswrapper[4774]: W1001 13:37:08.589984 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589991 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.589998 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590003 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590009 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590016 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590022 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590028 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590034 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590039 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590044 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590050 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590055 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590060 4774 feature_gate.go:330] unrecognized 
feature gate: AzureWorkloadIdentity Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590074 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590079 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590086 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590091 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590096 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590101 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590106 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590111 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590116 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590121 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590126 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590132 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590137 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590141 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:37:08 crc 
kubenswrapper[4774]: W1001 13:37:08.590147 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590151 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590157 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590163 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590169 4774 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590176 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590182 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590187 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590193 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590198 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590204 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590212 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590219 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590225 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590230 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590235 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590240 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590245 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.590250 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591044 4774 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591063 4774 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591074 4774 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591081 4774 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591089 4774 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591095 4774 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591103 4774 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591110 4774 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 
13:37:08.591117 4774 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591123 4774 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591129 4774 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591136 4774 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591142 4774 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591148 4774 flags.go:64] FLAG: --cgroup-root="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591154 4774 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591160 4774 flags.go:64] FLAG: --client-ca-file="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591166 4774 flags.go:64] FLAG: --cloud-config="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591172 4774 flags.go:64] FLAG: --cloud-provider="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591178 4774 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591186 4774 flags.go:64] FLAG: --cluster-domain="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591192 4774 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591199 4774 flags.go:64] FLAG: --config-dir="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591205 4774 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591211 4774 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591218 4774 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 13:37:08 crc kubenswrapper[4774]: 
I1001 13:37:08.591224 4774 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591231 4774 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591238 4774 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591244 4774 flags.go:64] FLAG: --contention-profiling="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591251 4774 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591259 4774 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591266 4774 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591271 4774 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591279 4774 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591285 4774 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591291 4774 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591297 4774 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591303 4774 flags.go:64] FLAG: --enable-server="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591309 4774 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591316 4774 flags.go:64] FLAG: --event-burst="100" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591323 4774 flags.go:64] FLAG: --event-qps="50" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591329 4774 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 
13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591335 4774 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591342 4774 flags.go:64] FLAG: --eviction-hard="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591349 4774 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591355 4774 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591362 4774 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591389 4774 flags.go:64] FLAG: --eviction-soft="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591396 4774 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591402 4774 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591408 4774 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591414 4774 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591420 4774 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591426 4774 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591432 4774 flags.go:64] FLAG: --feature-gates="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591439 4774 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591445 4774 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591480 4774 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591486 4774 flags.go:64] FLAG: 
--healthz-bind-address="127.0.0.1" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591493 4774 flags.go:64] FLAG: --healthz-port="10248" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591499 4774 flags.go:64] FLAG: --help="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591506 4774 flags.go:64] FLAG: --hostname-override="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591511 4774 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591518 4774 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591525 4774 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591531 4774 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591537 4774 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591543 4774 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591549 4774 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591555 4774 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591561 4774 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591568 4774 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591574 4774 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591580 4774 flags.go:64] FLAG: --kube-reserved="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591586 4774 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591592 4774 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591598 4774 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591636 4774 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591643 4774 flags.go:64] FLAG: --lock-file="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591649 4774 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591656 4774 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591662 4774 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591671 4774 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591677 4774 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591683 4774 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591689 4774 flags.go:64] FLAG: --logging-format="text" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591696 4774 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591702 4774 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591708 4774 flags.go:64] FLAG: --manifest-url="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591713 4774 flags.go:64] FLAG: --manifest-url-header="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591721 4774 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591728 4774 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591735 4774 
flags.go:64] FLAG: --max-pods="110" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591741 4774 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591747 4774 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591753 4774 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591759 4774 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591766 4774 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591772 4774 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591779 4774 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591792 4774 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591799 4774 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591805 4774 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591811 4774 flags.go:64] FLAG: --pod-cidr="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591831 4774 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591839 4774 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591846 4774 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591852 4774 flags.go:64] FLAG: --pods-per-core="0" Oct 01 13:37:08 
crc kubenswrapper[4774]: I1001 13:37:08.591858 4774 flags.go:64] FLAG: --port="10250" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591864 4774 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591870 4774 flags.go:64] FLAG: --provider-id="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591876 4774 flags.go:64] FLAG: --qos-reserved="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591882 4774 flags.go:64] FLAG: --read-only-port="10255" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591888 4774 flags.go:64] FLAG: --register-node="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591894 4774 flags.go:64] FLAG: --register-schedulable="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591900 4774 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591910 4774 flags.go:64] FLAG: --registry-burst="10" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591916 4774 flags.go:64] FLAG: --registry-qps="5" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591922 4774 flags.go:64] FLAG: --reserved-cpus="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591928 4774 flags.go:64] FLAG: --reserved-memory="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591935 4774 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591942 4774 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591948 4774 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591954 4774 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591960 4774 flags.go:64] FLAG: --runonce="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591966 4774 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591972 4774 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591978 4774 flags.go:64] FLAG: --seccomp-default="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591984 4774 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.591990 4774 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592004 4774 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592010 4774 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592017 4774 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592023 4774 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592029 4774 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592036 4774 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592042 4774 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592049 4774 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592055 4774 flags.go:64] FLAG: --system-cgroups="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592061 4774 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592070 4774 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592076 4774 flags.go:64] FLAG: --tls-cert-file="" Oct 01 
13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592082 4774 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592090 4774 flags.go:64] FLAG: --tls-min-version="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592096 4774 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592102 4774 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592108 4774 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592114 4774 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592120 4774 flags.go:64] FLAG: --v="2" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592127 4774 flags.go:64] FLAG: --version="false" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592135 4774 flags.go:64] FLAG: --vmodule="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592145 4774 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592151 4774 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592284 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592293 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592300 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592308 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592315 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592322 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592328 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592334 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592340 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592347 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592353 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592361 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592366 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592372 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592377 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592382 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592388 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 
13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592393 4774 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592399 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592404 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592410 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592416 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592421 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592426 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592432 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592437 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592442 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592466 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592473 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592478 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592483 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 
13:37:08.592488 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592493 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592498 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592503 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592509 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592514 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592519 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592526 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592533 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592539 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592547 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592553 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592560 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592565 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592570 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592575 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592580 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592585 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592590 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592595 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592600 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592605 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592610 4774 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592616 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592621 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592626 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592632 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592637 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592642 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592647 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592652 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592657 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592662 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592667 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592672 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592677 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592682 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 
13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592687 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592692 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.592697 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.592706 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.602603 4774 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.602639 4774 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602731 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602740 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602745 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602751 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602756 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602761 4774 feature_gate.go:330] 
unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602766 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602771 4774 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602775 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602780 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602785 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602789 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602794 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602798 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602803 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602808 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602812 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602817 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602821 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602826 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602830 4774 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602835 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602839 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602844 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602850 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602857 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602863 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602867 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602872 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602879 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602884 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602890 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602895 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602901 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602906 4774 
feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602911 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602916 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602921 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602928 4774 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602934 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602938 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602943 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602947 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602952 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602957 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602962 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602966 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602971 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602975 4774 feature_gate.go:330] unrecognized feature gate: 
ClusterMonitoringConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602980 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602985 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602989 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602994 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.602999 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603004 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603009 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603014 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603020 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603025 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603030 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603035 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603040 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603046 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:37:08 crc 
kubenswrapper[4774]: W1001 13:37:08.603052 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603059 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603067 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603073 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603078 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603085 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603090 4774 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603095 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.603104 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603234 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603245 4774 feature_gate.go:351] Setting 
deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603251 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603256 4774 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603260 4774 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603265 4774 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603270 4774 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603274 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603279 4774 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603284 4774 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603290 4774 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603294 4774 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603298 4774 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603303 4774 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603308 4774 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603314 4774 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAzure Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603318 4774 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603322 4774 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603326 4774 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603330 4774 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603336 4774 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603341 4774 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603346 4774 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603351 4774 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603355 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603359 4774 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603364 4774 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603368 4774 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603372 4774 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603377 4774 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 
01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603381 4774 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603385 4774 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603389 4774 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603393 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603397 4774 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603401 4774 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603406 4774 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603412 4774 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603416 4774 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603420 4774 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603424 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603430 4774 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603435 4774 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603440 4774 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603444 4774 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603473 4774 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603478 4774 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603483 4774 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603487 4774 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603491 4774 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603495 4774 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603500 4774 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603504 4774 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603509 4774 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603513 4774 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603518 4774 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603522 4774 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603526 4774 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603530 4774 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603534 4774 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603540 4774 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603545 4774 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603550 4774 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603555 4774 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603559 4774 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603563 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603568 4774 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603572 4774 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603577 4774 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603581 4774 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.603585 4774 feature_gate.go:330] 
unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.603621 4774 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.604796 4774 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.609799 4774 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.609906 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.611549 4774 server.go:997] "Starting client certificate rotation" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.611579 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.612712 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-05 16:15:24.433577386 +0000 UTC Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.612820 4774 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1562h38m15.820760734s for next certificate rotation Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.652926 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.655958 4774 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.682632 4774 log.go:25] "Validated CRI v1 runtime API" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.743355 4774 log.go:25] "Validated CRI v1 image API" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.745556 4774 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.755518 4774 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-13-33-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.755594 4774 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.776079 4774 manager.go:217] Machine: {Timestamp:2025-10-01 13:37:08.772503004 +0000 UTC m=+0.662133681 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:75fe681e-c594-4ab2-ad84-cd261c47a27a BootID:43812622-110d-4c9c-94ff-65b8a298322f Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fe:eb:12 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:fe:eb:12 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:21:c3:dd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:14:9a:89 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d6:40:8e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:95:ef:e1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ae:b5:56:f0:f9:ce Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:36:ba:93:d2:bc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.776507 4774 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.776692 4774 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.777281 4774 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.777636 4774 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.777690 4774 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.777980 4774 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.777994 4774 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.778656 4774 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.778698 4774 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.779377 4774 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.779905 4774 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.784623 4774 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.784663 4774 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.784696 4774 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.784713 4774 kubelet.go:324] "Adding apiserver pod source"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.784726 4774 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.789137 4774 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.791615 4774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.794255 4774 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.794291 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.794314 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.794387 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.794406 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795774 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795806 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795817 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795826 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795842 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795851 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795861 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795876 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795889 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795899 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795934 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.795943 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.796973 4774 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.797590 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.797633 4774 server.go:1280] "Started kubelet"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.797796 4774 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.797845 4774 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.798367 4774 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.800244 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.800274 4774 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.800367 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 08:58:34.148362978 +0000 UTC
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.800502 4774 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1219h21m25.347866548s for next certificate rotation
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.800570 4774 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.800588 4774 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.800625 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 13:37:08 crc systemd[1]: Started Kubernetes Kubelet.
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.801124 4774 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.801325 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused
Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.801413 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.801440 4774 factory.go:55] Registering systemd factory
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.801485 4774 factory.go:221] Registration of the systemd container factory successfully
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.801879 4774 factory.go:153] Registering CRI-O factory
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.801909 4774 factory.go:221] Registration of the crio container factory successfully
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.802009 4774 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.802035 4774 factory.go:103] Registering Raw factory
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.802052 4774 manager.go:1196] Started watching for new ooms in manager
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.802794 4774 manager.go:319] Starting recovery of all containers
Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.802942 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="200ms"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.810538 4774 server.go:460] "Adding debug handlers to kubelet server"
Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.809486 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.180:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a61851da6933f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 13:37:08.797600575 +0000 UTC m=+0.687231212,LastTimestamp:2025-10-01 13:37:08.797600575 +0000 UTC m=+0.687231212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.823897 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.823998 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824023 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824041 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824071 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824091 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824111 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824135 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824157 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824176 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824196 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824214 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824236 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824260 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824280 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824301 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824321 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824341 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824358 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824376 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824395 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824416 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824436 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824483 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824502 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824522 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824544 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824567 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824590 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824610 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824628 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824649 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824670 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824688 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824738 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824755 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824774 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824796 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824814 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824833 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824852 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824870 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824889 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824909 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824927 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824945 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.824965 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825008 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825026 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825046 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825066 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825084 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825109 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825131 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825171 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825201 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825223 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825243 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825260 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825278 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825296 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825316 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825335 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825392 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825421 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825439 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825484 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825503 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825521 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825540 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825558 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825576 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825595 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825613 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825633 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825652 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825671 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825690 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825708 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825729 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825747 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825765 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825782 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28"
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825803 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825820 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825841 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825860 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825878 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825897 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825916 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825935 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825953 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825976 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.825993 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826012 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826032 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826055 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826075 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826094 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826112 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826131 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826148 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826168 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826185 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826219 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826239 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826256 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826276 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826295 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826314 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826334 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826358 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826377 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" 
seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826396 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826412 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826429 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826521 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826541 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826559 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 
13:37:08.826577 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826596 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826613 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826630 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826650 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826666 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826685 4774 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826702 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826722 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826739 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826759 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826776 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826794 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826812 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826832 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826850 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826868 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826888 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826906 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826922 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826942 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826960 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826977 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.826998 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827018 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827035 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827053 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827070 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827089 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827108 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827125 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827143 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827162 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827181 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827199 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827217 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827237 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827260 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827280 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827300 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827325 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827344 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827363 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827383 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827400 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827420 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827447 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827520 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827546 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827565 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827584 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827602 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827623 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827641 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827660 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" 
seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827679 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827700 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827719 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827737 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.827756 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.838637 4774 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.839209 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.839366 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.839588 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.839726 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.839842 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.839959 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840079 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840216 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840336 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840484 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840607 4774 manager.go:324] Recovery completed Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840629 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.840898 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841026 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841144 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841257 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841373 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841539 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841671 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.841837 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.842000 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.842123 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.842275 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.842594 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.842763 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.842902 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.843034 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.843177 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.843347 4774 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.843529 4774 reconstruct.go:97] "Volume reconstruction finished" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.843668 4774 reconciler.go:26] "Reconciler: start to sync state" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.851252 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.857271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: 
I1001 13:37:08.857319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.857335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.859387 4774 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.859410 4774 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.859433 4774 state_mem.go:36] "Initialized new in-memory state store" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.867604 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.869117 4774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.869161 4774 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.869190 4774 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.869237 4774 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 13:37:08 crc kubenswrapper[4774]: W1001 13:37:08.871832 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.871936 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.894093 4774 policy_none.go:49] "None policy: Start" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.895611 4774 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.895637 4774 state_mem.go:35] "Initializing new in-memory state store" Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.900892 4774 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.954773 4774 manager.go:334] "Starting Device Plugin manager" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.954833 4774 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.954847 4774 server.go:79] "Starting device plugin registration server" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.955278 4774 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.955301 4774 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.955524 4774 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.955649 4774 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.955659 4774 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 13:37:08 crc kubenswrapper[4774]: E1001 13:37:08.962307 4774 eviction_manager.go:285] "Eviction 
manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.969594 4774 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.969673 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.971050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.971076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.971084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.971212 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.971613 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.971644 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972404 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972502 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.972530 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.974945 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.975203 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.975268 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.975952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.975976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976118 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976278 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976334 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976789 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.976876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.977156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.977180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.977191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.977315 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.977344 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.978026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.978075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.978097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.979018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.979060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:08 crc kubenswrapper[4774]: I1001 13:37:08.979086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.003824 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="400ms" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.045923 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc 
kubenswrapper[4774]: I1001 13:37:09.046084 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046159 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046229 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc 
kubenswrapper[4774]: I1001 13:37:09.046379 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046426 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046554 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046580 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046623 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046658 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046714 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.046751 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.056191 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.057384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.057418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.057430 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.057469 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.057897 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147786 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147845 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147867 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147885 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147905 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147945 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147963 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147982 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.147979 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148001 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148110 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148145 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148148 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148191 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148219 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148225 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148237 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148255 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148276 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148293 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148309 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148200 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148164 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc 
kubenswrapper[4774]: I1001 13:37:09.148361 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148383 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148404 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.148258 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.258296 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.259568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.259688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:09 crc 
kubenswrapper[4774]: I1001 13:37:09.259749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.259813 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.260193 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.311570 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.320880 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.347281 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.368977 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.377415 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.404502 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="800ms" Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.527635 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e407fd3b1c92412c1f9b45674eb5d527c94ce2f3f5b724bb252ba6c1d00f1ce4 WatchSource:0}: Error finding container e407fd3b1c92412c1f9b45674eb5d527c94ce2f3f5b724bb252ba6c1d00f1ce4: Status 404 returned error can't find the container with id e407fd3b1c92412c1f9b45674eb5d527c94ce2f3f5b724bb252ba6c1d00f1ce4 Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.531080 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ff4b21b291fee849c3e498935347c206f1156811ba90ffeb17d78e05d74100ef WatchSource:0}: Error finding container ff4b21b291fee849c3e498935347c206f1156811ba90ffeb17d78e05d74100ef: Status 404 returned error can't find the container with id ff4b21b291fee849c3e498935347c206f1156811ba90ffeb17d78e05d74100ef Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.533369 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a113ddc248535ae4fe6af75f54ba60f5e9a2770b2b9b5128246c1555a26e6124 WatchSource:0}: Error finding container a113ddc248535ae4fe6af75f54ba60f5e9a2770b2b9b5128246c1555a26e6124: Status 404 returned error can't find the container with id 
a113ddc248535ae4fe6af75f54ba60f5e9a2770b2b9b5128246c1555a26e6124 Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.535199 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-638deab99962226a1f7c748d87b1fd2935ee3d6683a7f1cdedd908326868fa54 WatchSource:0}: Error finding container 638deab99962226a1f7c748d87b1fd2935ee3d6683a7f1cdedd908326868fa54: Status 404 returned error can't find the container with id 638deab99962226a1f7c748d87b1fd2935ee3d6683a7f1cdedd908326868fa54 Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.536562 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-a9a7b26ecf8f928a89ef8117b621e91d62774a16a12662b9264b2f49f3377653 WatchSource:0}: Error finding container a9a7b26ecf8f928a89ef8117b621e91d62774a16a12662b9264b2f49f3377653: Status 404 returned error can't find the container with id a9a7b26ecf8f928a89ef8117b621e91d62774a16a12662b9264b2f49f3377653 Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.661011 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.663253 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.663320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.663337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.663377 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 
13:37:09.664035 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.725729 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.725865 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.736257 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.736373 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.799169 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection 
refused Oct 01 13:37:09 crc kubenswrapper[4774]: W1001 13:37:09.870213 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:09 crc kubenswrapper[4774]: E1001 13:37:09.870313 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.874830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a9a7b26ecf8f928a89ef8117b621e91d62774a16a12662b9264b2f49f3377653"} Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.875904 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"638deab99962226a1f7c748d87b1fd2935ee3d6683a7f1cdedd908326868fa54"} Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.876921 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a113ddc248535ae4fe6af75f54ba60f5e9a2770b2b9b5128246c1555a26e6124"} Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.878073 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e407fd3b1c92412c1f9b45674eb5d527c94ce2f3f5b724bb252ba6c1d00f1ce4"} Oct 01 13:37:09 crc kubenswrapper[4774]: I1001 13:37:09.879721 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ff4b21b291fee849c3e498935347c206f1156811ba90ffeb17d78e05d74100ef"} Oct 01 13:37:10 crc kubenswrapper[4774]: E1001 13:37:10.206582 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="1.6s" Oct 01 13:37:10 crc kubenswrapper[4774]: W1001 13:37:10.375952 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:10 crc kubenswrapper[4774]: E1001 13:37:10.376051 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:10 crc kubenswrapper[4774]: I1001 13:37:10.465033 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:10 crc kubenswrapper[4774]: I1001 13:37:10.466791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:10 crc kubenswrapper[4774]: I1001 13:37:10.466822 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:10 crc kubenswrapper[4774]: I1001 13:37:10.466831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:10 crc kubenswrapper[4774]: I1001 13:37:10.466849 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:10 crc kubenswrapper[4774]: E1001 13:37:10.467054 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 01 13:37:10 crc kubenswrapper[4774]: I1001 13:37:10.799050 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.799629 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:11 crc kubenswrapper[4774]: E1001 13:37:11.808120 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.180:6443: connect: connection refused" interval="3.2s" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.887027 4774 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f" exitCode=0 Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.887138 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.887278 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.889093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.889142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.889155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.889593 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5" exitCode=0 Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.889737 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.889810 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.891289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.891333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.891348 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.891594 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5" exitCode=0 Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.891704 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.891773 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.892770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.893112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.893123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.894752 4774 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486" exitCode=0 Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.894900 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.894946 4774 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.895055 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.899374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.899487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.899522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.899588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.899669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.899685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.906776 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.906836 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.906855 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.906867 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc"} Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.906983 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.908064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.908094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.908105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.915583 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.916137 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Oct 01 13:37:11 crc kubenswrapper[4774]: I1001 13:37:11.916256 4774 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Oct 01 13:37:11 crc kubenswrapper[4774]: E1001 13:37:11.940610 4774 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.180:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a61851da6933f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 13:37:08.797600575 +0000 UTC m=+0.687231212,LastTimestamp:2025-10-01 13:37:08.797600575 +0000 UTC m=+0.687231212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 01 13:37:11 crc kubenswrapper[4774]: W1001 13:37:11.954310 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:11 crc kubenswrapper[4774]: E1001 13:37:11.954409 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.067329 4774 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.068599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.068639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.068650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.068676 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:12 crc kubenswrapper[4774]: E1001 13:37:12.069215 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.180:6443: connect: connection refused" node="crc" Oct 01 13:37:12 crc kubenswrapper[4774]: W1001 13:37:12.725537 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:12 crc kubenswrapper[4774]: E1001 13:37:12.725661 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.798223 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: 
connection refused Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.914168 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.914251 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.914259 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.914306 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.915664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.915712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.915726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.919163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e"} Oct 
01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.919209 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.919226 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.919240 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.922939 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828" exitCode=0 Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.923034 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.923069 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.924328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.924352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.924364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.929172 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.929163 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5"} Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.929200 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.930245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.930300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.930314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.931216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.931266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:12 crc kubenswrapper[4774]: I1001 13:37:12.931285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:12 crc kubenswrapper[4774]: W1001 13:37:12.936052 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:12 crc kubenswrapper[4774]: E1001 13:37:12.936152 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:13 crc kubenswrapper[4774]: W1001 13:37:13.148614 4774 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:13 crc kubenswrapper[4774]: E1001 13:37:13.148940 4774 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.180:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.798912 4774 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.180:6443: connect: connection refused Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.929899 4774 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082" exitCode=0 Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.929970 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082"} Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.930039 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.931411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.931502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.931521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.931767 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933517 4774 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9" exitCode=255 Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933596 4774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933607 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933633 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933714 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 
13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933711 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9"} Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.933945 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.934761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.934804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.934822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.934766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.934929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.934947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.935516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.935553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.935602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.935623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.935565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.935674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:13 crc kubenswrapper[4774]: I1001 13:37:13.936256 4774 scope.go:117] "RemoveContainer" containerID="2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.269598 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.942384 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09c6caf5ea0c1"} Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.942507 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2"} Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.942529 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415"} Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.944754 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.946342 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39"} Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.946620 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.947368 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.947854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.947914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:14 crc kubenswrapper[4774]: I1001 13:37:14.947937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.269629 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.270900 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.270937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.270952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.270977 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.954333 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.954917 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7"} Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.954995 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5"} Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.955049 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.955055 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.956562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.956581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.956604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.956612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.956613 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:15 crc kubenswrapper[4774]: I1001 13:37:15.956628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.015825 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.016032 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.017502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.017554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.017571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.135519 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.135791 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.139853 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.139908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.139926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.957499 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.957517 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.959149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.959207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.959226 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.959236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.959275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:16 crc kubenswrapper[4774]: I1001 13:37:16.959296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.159587 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.159795 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.160727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.160771 4774 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.160788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.391652 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.391868 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.393392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.393424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.393432 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.700228 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.700749 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.702874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.703067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:18 crc kubenswrapper[4774]: I1001 13:37:18.703196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Oct 01 13:37:18 crc kubenswrapper[4774]: E1001 13:37:18.962489 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 13:37:19 crc kubenswrapper[4774]: I1001 13:37:19.002863 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 13:37:19 crc kubenswrapper[4774]: I1001 13:37:19.003107 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:19 crc kubenswrapper[4774]: I1001 13:37:19.004656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:19 crc kubenswrapper[4774]: I1001 13:37:19.004744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:19 crc kubenswrapper[4774]: I1001 13:37:19.004763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.392339 4774 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.392512 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.506640 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.506873 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.508286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.508348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.508372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.922616 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.975076 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.976629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.976713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.976738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:21 crc kubenswrapper[4774]: I1001 13:37:21.979941 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:22 crc kubenswrapper[4774]: I1001 13:37:22.979858 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 01 13:37:22 crc kubenswrapper[4774]: I1001 13:37:22.981281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:22 crc kubenswrapper[4774]: I1001 13:37:22.981337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:22 crc kubenswrapper[4774]: I1001 13:37:22.981355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:24 crc kubenswrapper[4774]: I1001 13:37:24.270604 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 13:37:24 crc kubenswrapper[4774]: I1001 13:37:24.270714 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 13:37:24 crc kubenswrapper[4774]: I1001 13:37:24.626538 4774 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 13:37:24 crc kubenswrapper[4774]: I1001 13:37:24.626640 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.209101 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.209704 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.221588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.221643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.221655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.239416 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 13:37:28 crc kubenswrapper[4774]: E1001 13:37:28.962625 4774 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.995579 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.996720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.996771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:28 crc kubenswrapper[4774]: I1001 13:37:28.996785 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.276197 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.276396 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.277431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.277497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.277513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.280574 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.626676 4774 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.630360 4774 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632014 4774 trace.go:236] Trace[207816237]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 13:37:16.714) (total time: 12917ms): Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[207816237]: ---"Objects listed" error: 12917ms (13:37:29.631) Oct 01 13:37:29 crc 
kubenswrapper[4774]: Trace[207816237]: [12.917544163s] [12.917544163s] END Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632043 4774 trace.go:236] Trace[119823496]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 13:37:16.012) (total time: 13619ms): Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[119823496]: ---"Objects listed" error: 13619ms (13:37:29.631) Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[119823496]: [13.619161977s] [13.619161977s] END Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632053 4774 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632059 4774 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632379 4774 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632879 4774 trace.go:236] Trace[1044934304]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 13:37:18.119) (total time: 11512ms): Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[1044934304]: ---"Objects listed" error: 11512ms (13:37:29.632) Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[1044934304]: [11.512876422s] [11.512876422s] END Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.632899 4774 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.634055 4774 trace.go:236] Trace[1179940099]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 13:37:17.252) (total time: 12381ms): Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[1179940099]: ---"Objects listed" error: 12381ms (13:37:29.634) Oct 01 13:37:29 crc kubenswrapper[4774]: Trace[1179940099]: [12.381595286s] [12.381595286s] END Oct 01 
13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.634074 4774 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.797283 4774 apiserver.go:52] "Watching apiserver" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.803827 4774 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.804791 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.806012 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.806130 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.806193 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.806398 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.806442 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.806536 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.806697 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.806751 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.806771 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.809840 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.811805 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.812109 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.812150 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.812118 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.813996 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.814032 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.814065 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.817670 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.839970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.854326 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.862929 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.871900 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.883606 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.894818 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.902323 4774 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.904355 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934062 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934181 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934223 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934261 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934297 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934328 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934358 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934389 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934407 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934420 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934536 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934559 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934578 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934599 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934621 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934640 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934662 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934681 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934694 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934730 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934735 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934754 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934824 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934906 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934941 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934976 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935019 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935053 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935087 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935121 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935155 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935201 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935235 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935276 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935310 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935346 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935379 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935415 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935481 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935528 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935563 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935599 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 13:37:29 crc 
kubenswrapper[4774]: I1001 13:37:29.935639 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935674 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935706 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935741 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935776 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935809 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935841 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935874 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935907 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935940 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935973 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936007 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936090 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936126 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936161 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936194 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936233 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936510 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936619 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936660 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936694 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936728 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.934861 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936764 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936812 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936846 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936888 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936921 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936957 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937001 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937036 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937076 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937111 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937190 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937226 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937264 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937301 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937336 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937372 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937407 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937442 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937538 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937573 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:37:29 crc 
kubenswrapper[4774]: I1001 13:37:29.937608 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937651 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937689 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937724 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937765 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937869 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937912 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937949 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937990 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938028 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 
13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938072 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938114 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938194 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938235 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938279 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938316 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938359 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938401 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938442 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938684 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 13:37:29 crc 
kubenswrapper[4774]: I1001 13:37:29.938741 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938785 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938828 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938874 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938917 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938964 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939015 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939051 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939095 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939132 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939207 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939244 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939281 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939318 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939357 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939400 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939436 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939506 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939550 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939589 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939626 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939669 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939705 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939749 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939784 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939820 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939855 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " 
Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939892 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939928 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939966 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940002 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940038 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940077 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940114 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940150 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940384 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940429 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940493 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:37:29 crc 
kubenswrapper[4774]: I1001 13:37:29.940529 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940566 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940602 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940644 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940681 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940810 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940859 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940899 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940939 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.940987 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941028 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941066 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941104 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941141 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941178 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941215 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941252 4774 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941291 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941327 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941363 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941443 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" 
(UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941526 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941565 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941612 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941649 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941688 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941724 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941761 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941799 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941838 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941913 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 
01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941949 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941985 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942022 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942059 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942103 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942144 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942188 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942227 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942264 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942322 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942377 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 13:37:29 
crc kubenswrapper[4774]: I1001 13:37:29.942424 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942486 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942526 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942567 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942652 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942696 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942822 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942867 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943148 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943192 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943233 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943279 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943320 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943359 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943401 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943443 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943524 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943564 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943609 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943858 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943885 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943912 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943938 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943960 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.945167 4774 swap_util.go:74] "error creating dir to test if 
tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.949431 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935020 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.954934 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.954960 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.955215 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.955488 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.957177 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935061 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935068 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935321 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935331 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935514 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935633 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.935684 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936063 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936098 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936169 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936231 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936572 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936620 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936749 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936816 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.936989 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937082 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937238 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937368 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937438 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.937789 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938069 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938336 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938367 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938430 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938675 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.938941 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939002 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939084 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939313 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939531 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.939729 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.941836 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942156 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942219 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942493 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.942497 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943147 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943230 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943869 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.943916 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.944556 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.944805 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.945180 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.945332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.945360 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.944643 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.945784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946129 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946219 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946509 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946588 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946621 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946638 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.946940 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.947264 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.947611 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.947903 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.948816 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.948968 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.949523 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.949658 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.949850 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.950149 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.950263 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.950533 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.950687 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951077 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951103 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951147 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951554 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951609 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951730 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951865 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.951977 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952006 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952062 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952087 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952165 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952275 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952406 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952494 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952506 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952521 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952612 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952673 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952855 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952903 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.952973 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953014 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953172 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953282 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953076 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953548 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953721 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.953801 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.954192 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.954417 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.954441 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.954665 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.957971 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.957784 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.958304 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.960791 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.963184 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.964313 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.967282 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.970546 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.970817 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.973198 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.973222 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.959046 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.962721 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.975691 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 
13:37:29.975713 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.975728 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.976528 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.976741 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:37:30.476699042 +0000 UTC m=+22.366329679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.977003 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.977014 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.976518 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.977132 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.977355 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.977409 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.977527 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:30.477443084 +0000 UTC m=+22.367073691 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.978018 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.978794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.978956 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979550 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979650 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979120 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979348 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979415 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979529 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979863 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.979979 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.980260 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.980631 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.980701 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.980821 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.981271 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.981528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.981657 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.981802 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.978054 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.981844 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.982043 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.982229 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.982271 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.982675 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.982916 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.983117 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.983634 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.983811 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.984042 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.984003 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.984170 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.985116 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.985497 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.985814 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.985843 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.986250 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.986391 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.986398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.987026 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.987562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.987675 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.987872 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.987886 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.988236 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.988379 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.988368 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.988573 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.988762 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.988768 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.988872 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:30.478139704 +0000 UTC m=+22.367770481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.989031 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.989252 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:30.489184895 +0000 UTC m=+22.378815622 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:29 crc kubenswrapper[4774]: E1001 13:37:29.989291 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:30.489280508 +0000 UTC m=+22.378911115 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.992737 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:29 crc kubenswrapper[4774]: I1001 13:37:29.996866 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.001233 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.001629 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.002087 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.002313 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.002391 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.002558 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.002634 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.003073 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.004620 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.005863 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.009196 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.010062 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.010543 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.010545 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.011712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.012902 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.020221 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.021320 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.024987 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.026045 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.034979 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.043572 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044827 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044900 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044915 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044926 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044938 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044952 4774 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044963 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044975 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044986 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.044998 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045011 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045021 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045029 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045038 4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045049 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045061 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045110 4774 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045121 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045182 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045247 
4774 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045275 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045296 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045318 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045340 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045361 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045380 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045399 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045418 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045495 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045525 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045549 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045573 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045602 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045628 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc 
kubenswrapper[4774]: I1001 13:37:30.045654 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045679 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045702 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045725 4774 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045748 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045772 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045796 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045822 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045852 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045877 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045896 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045915 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045933 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045952 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045973 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on 
node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.045993 4774 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046014 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046037 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046058 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046079 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046100 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046131 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath 
\"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046151 4774 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046172 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046190 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046208 4774 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046227 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046245 4774 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046264 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 
13:37:30.046282 4774 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046302 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046319 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046337 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046355 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046375 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046392 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046419 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046438 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046487 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046506 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046525 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046544 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046565 4774 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046584 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046604 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046623 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046643 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046660 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046681 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046699 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046717 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc 
kubenswrapper[4774]: I1001 13:37:30.046735 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046754 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046772 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046790 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046809 4774 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046827 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046845 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046865 4774 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046883 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046902 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046922 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046940 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046957 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046975 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.046993 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047011 4774 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047029 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047047 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047066 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047085 4774 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047104 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047123 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047142 4774 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047164 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047190 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047216 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047240 4774 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047265 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047288 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 
13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047310 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047334 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047359 4774 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047385 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047408 4774 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047432 4774 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047514 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047542 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047560 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047577 4774 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047597 4774 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047614 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047633 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047652 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047671 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047692 4774 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047711 4774 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047729 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047747 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047766 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047784 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047801 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047820 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047838 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047856 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047874 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047893 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047912 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047930 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047949 4774 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047966 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.047984 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048003 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048021 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048039 4774 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048061 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048078 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc 
kubenswrapper[4774]: I1001 13:37:30.048095 4774 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048114 4774 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048133 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048151 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048170 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048188 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048206 4774 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048225 4774 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048243 4774 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048260 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048280 4774 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048299 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048317 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048335 4774 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048353 4774 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048371 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048389 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048408 4774 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048427 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048445 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048511 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048530 4774 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048549 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048569 4774 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048589 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048607 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048627 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048645 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048664 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048684 4774 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048702 4774 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048721 4774 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048738 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048756 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048775 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048793 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048813 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048831 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048849 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.048869 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.051020 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.122841 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.134432 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.140742 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:37:30 crc kubenswrapper[4774]: W1001 13:37:30.147724 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c81d85e6673512ea2ad113defb1f1df4125fe820408f1395c8588c6b5ce2e99b WatchSource:0}: Error finding container c81d85e6673512ea2ad113defb1f1df4125fe820408f1395c8588c6b5ce2e99b: Status 404 returned error can't find the container with id c81d85e6673512ea2ad113defb1f1df4125fe820408f1395c8588c6b5ce2e99b Oct 01 13:37:30 crc kubenswrapper[4774]: W1001 13:37:30.154923 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-57bad402ec6f14ebb9fe066ed496b6cd8e23250b074849552179c0ed361ce9bb WatchSource:0}: Error finding container 57bad402ec6f14ebb9fe066ed496b6cd8e23250b074849552179c0ed361ce9bb: Status 404 returned error can't find the container with id 57bad402ec6f14ebb9fe066ed496b6cd8e23250b074849552179c0ed361ce9bb Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.232819 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.253252 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.270580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.285826 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.303891 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.314946 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.327712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.342979 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.551724 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:37:31.551705965 +0000 UTC m=+23.441336562 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.551742 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.551864 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.551899 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.551929 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.551983 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.551995 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552034 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:31.552026914 +0000 UTC m=+23.441657511 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552102 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552149 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552182 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552196 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552159 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:31.552142628 +0000 UTC m=+23.441773235 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552282 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:31.552261331 +0000 UTC m=+23.441891938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552363 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552378 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552388 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.552425 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:31.552416376 +0000 UTC m=+23.442046983 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.672494 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.679227 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.685794 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.689898 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.696597 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.708375 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.718951 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.732507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.743687 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.756855 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.774994 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd4
9d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.786128 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.798474 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.819893 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.834938 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.846243 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.857659 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.869808 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:30 crc kubenswrapper[4774]: E1001 13:37:30.870019 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.876838 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.877512 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.878407 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.878574 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 
13:37:30.879140 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.880049 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.880527 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.881071 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.881934 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.882507 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.883353 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.883823 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 
13:37:30.884857 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.885556 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.886034 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.887028 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.887529 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.888375 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.888748 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.889269 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 
13:37:30.890206 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.890660 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.891649 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.892087 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.893172 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.893570 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.894125 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.895151 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 
13:37:30.895729 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.896737 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.897288 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.898271 4774 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.898382 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.900294 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.901444 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.901863 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.903422 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.904081 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.905012 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.905747 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.906773 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.907259 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.908285 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.909080 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.910195 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.910890 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.911982 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.912601 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.913670 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.914121 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.915008 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.915444 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.916281 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.916828 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.917346 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.964834 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-679cg"] Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.965599 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.968319 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.968330 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.968922 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.978728 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc358
25771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd4
9d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:30 crc kubenswrapper[4774]: I1001 13:37:30.991323 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.002144 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.009225 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"57bad402ec6f14ebb9fe066ed496b6cd8e23250b074849552179c0ed361ce9bb"} Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.010947 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa"} Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.011001 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2"} Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.011014 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c81d85e6673512ea2ad113defb1f1df4125fe820408f1395c8588c6b5ce2e99b"} Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.013234 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b"} Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.013272 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3ae633d5f78a040aa84b69571e94194e578526bdd97a378cfc9d6510daca4eef"} Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.016117 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.030188 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.047847 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.056518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8660d853-6a54-48f4-a6fd-275176a4bf1d-hosts-file\") pod \"node-resolver-679cg\" (UID: \"8660d853-6a54-48f4-a6fd-275176a4bf1d\") " pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.056593 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcfkx\" (UniqueName: \"kubernetes.io/projected/8660d853-6a54-48f4-a6fd-275176a4bf1d-kube-api-access-zcfkx\") pod \"node-resolver-679cg\" (UID: \"8660d853-6a54-48f4-a6fd-275176a4bf1d\") " pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.066363 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.088694 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.105746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.122021 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.150647 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.157824 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8660d853-6a54-48f4-a6fd-275176a4bf1d-hosts-file\") pod \"node-resolver-679cg\" (UID: \"8660d853-6a54-48f4-a6fd-275176a4bf1d\") " pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.157914 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcfkx\" (UniqueName: \"kubernetes.io/projected/8660d853-6a54-48f4-a6fd-275176a4bf1d-kube-api-access-zcfkx\") pod \"node-resolver-679cg\" 
(UID: \"8660d853-6a54-48f4-a6fd-275176a4bf1d\") " pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.158419 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8660d853-6a54-48f4-a6fd-275176a4bf1d-hosts-file\") pod \"node-resolver-679cg\" (UID: \"8660d853-6a54-48f4-a6fd-275176a4bf1d\") " pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.181065 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.192542 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcfkx\" (UniqueName: \"kubernetes.io/projected/8660d853-6a54-48f4-a6fd-275176a4bf1d-kube-api-access-zcfkx\") pod \"node-resolver-679cg\" (UID: \"8660d853-6a54-48f4-a6fd-275176a4bf1d\") " 
pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.223642 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.237026 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.254624 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.271659 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.288021 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.288495 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-679cg" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.302094 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.562465 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.562530 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.562552 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.562573 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.562592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.562671 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.562715 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 13:37:33.562701917 +0000 UTC m=+25.452332514 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.562810 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.562867 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.562892 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.562986 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:33.562954985 +0000 UTC m=+25.452585682 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563031 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:37:33.563024107 +0000 UTC m=+25.452654704 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563036 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563082 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:33.563072068 +0000 UTC m=+25.452702665 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563224 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563283 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563307 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.563396 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:33.563368637 +0000 UTC m=+25.452999274 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.786603 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-74ttd"] Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.786930 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.788181 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.789700 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.789709 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.789800 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.790969 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.796865 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.805239 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.812806 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.820234 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.829376 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.838244 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.845815 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.851696 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.859637 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.865673 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18618ab0-7244-42b3-9ccd-60661c89c742-proxy-tls\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.865727 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/18618ab0-7244-42b3-9ccd-60661c89c742-rootfs\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.865744 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmqb\" (UniqueName: \"kubernetes.io/projected/18618ab0-7244-42b3-9ccd-60661c89c742-kube-api-access-cjmqb\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.865763 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18618ab0-7244-42b3-9ccd-60661c89c742-mcd-auth-proxy-config\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.869655 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.869823 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.869671 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:31 crc kubenswrapper[4774]: E1001 13:37:31.870014 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.871931 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.966249 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/18618ab0-7244-42b3-9ccd-60661c89c742-rootfs\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.966608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18618ab0-7244-42b3-9ccd-60661c89c742-mcd-auth-proxy-config\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.966632 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmqb\" (UniqueName: \"kubernetes.io/projected/18618ab0-7244-42b3-9ccd-60661c89c742-kube-api-access-cjmqb\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 
13:37:31.966665 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18618ab0-7244-42b3-9ccd-60661c89c742-proxy-tls\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.966376 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/18618ab0-7244-42b3-9ccd-60661c89c742-rootfs\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.967508 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/18618ab0-7244-42b3-9ccd-60661c89c742-mcd-auth-proxy-config\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.977209 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18618ab0-7244-42b3-9ccd-60661c89c742-proxy-tls\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:31 crc kubenswrapper[4774]: I1001 13:37:31.991888 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmqb\" (UniqueName: \"kubernetes.io/projected/18618ab0-7244-42b3-9ccd-60661c89c742-kube-api-access-cjmqb\") pod \"machine-config-daemon-74ttd\" (UID: \"18618ab0-7244-42b3-9ccd-60661c89c742\") " pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:32 crc 
kubenswrapper[4774]: I1001 13:37:32.017263 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-679cg" event={"ID":"8660d853-6a54-48f4-a6fd-275176a4bf1d","Type":"ContainerStarted","Data":"3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54"} Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.017316 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-679cg" event={"ID":"8660d853-6a54-48f4-a6fd-275176a4bf1d","Type":"ContainerStarted","Data":"e74187b3c38b481727e56d52413b1099d0558d71b8790292a0b6f416d02e48cc"} Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.033614 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.046227 4774 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.062392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.078723 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.092771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.097968 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.111127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: W1001 13:37:32.116058 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18618ab0_7244_42b3_9ccd_60661c89c742.slice/crio-67b3ce19736f25d4336d788dfd046ad433c140b2e5732266207107c1054b9cfe WatchSource:0}: Error finding container 67b3ce19736f25d4336d788dfd046ad433c140b2e5732266207107c1054b9cfe: Status 404 returned error can't find the container with id 67b3ce19736f25d4336d788dfd046ad433c140b2e5732266207107c1054b9cfe Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.135975 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.150021 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8svls"] Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.150496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.150646 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-h5t2l"] Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.164653 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.165159 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.166215 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.166223 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.166648 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.173630 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.170748 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 
13:37:32.178667 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.185866 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.218768 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.236660 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.252273 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.269957 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270102 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-cni-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270133 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-os-release\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270156 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-netns\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270177 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-daemon-config\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cni-binary-copy\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270222 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-k8s-cni-cncf-io\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270246 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270266 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-socket-dir-parent\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270286 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-system-cni-dir\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270306 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cnibin\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-os-release\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 
13:37:32.270382 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-cni-multus\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-hostroot\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270432 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-system-cni-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-etc-kubernetes\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270494 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-cni-bin\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270519 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-cnibin\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-conf-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270630 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270742 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhsrc\" (UniqueName: \"kubernetes.io/projected/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-kube-api-access-dhsrc\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270782 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-kubelet\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270817 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-cni-binary-copy\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270833 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-multus-certs\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.270848 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmmmz\" (UniqueName: \"kubernetes.io/projected/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-kube-api-access-kmmmz\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.284073 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.297185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.312215 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.323975 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.334704 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.347864 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.362151 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371583 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-socket-dir-parent\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371623 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-system-cni-dir\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371651 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cnibin\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-os-release\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371688 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-cni-multus\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371704 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-hostroot\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371720 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-system-cni-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371734 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-etc-kubernetes\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371748 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-cni-bin\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371765 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-cnibin\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371784 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-system-cni-dir\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371847 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-conf-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371863 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-system-cni-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " 
pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371868 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-hostroot\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371905 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-cni-bin\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371911 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-etc-kubernetes\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371886 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-socket-dir-parent\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371931 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-cni-multus\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371780 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-conf-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371944 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-cnibin\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371952 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-os-release\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371967 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.371784 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cnibin\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372027 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhsrc\" (UniqueName: 
\"kubernetes.io/projected/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-kube-api-access-dhsrc\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372047 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-kubelet\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372072 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-cni-binary-copy\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-multus-certs\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372113 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-var-lib-kubelet\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372113 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmmmz\" (UniqueName: \"kubernetes.io/projected/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-kube-api-access-kmmmz\") 
pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372156 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-cni-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372179 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-os-release\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372195 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-netns\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372210 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-daemon-config\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372225 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cni-binary-copy\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc 
kubenswrapper[4774]: I1001 13:37:32.372246 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-k8s-cni-cncf-io\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372264 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372432 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-os-release\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372494 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-multus-certs\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372762 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-cni-dir\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-cni-binary-copy\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372830 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-netns\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372912 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.372957 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-host-run-k8s-cni-cncf-io\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.373243 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-cni-binary-copy\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.373344 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-multus-daemon-config\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.375618 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.377391 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-tuning-conf-dir\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.392471 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.394375 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmmmz\" (UniqueName: \"kubernetes.io/projected/be8a0f8f-0098-4fa6-b4b2-ceda580f19b5-kube-api-access-kmmmz\") pod \"multus-8svls\" (UID: \"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\") " pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.395007 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhsrc\" (UniqueName: \"kubernetes.io/projected/f6c2cbe4-dd67-4f5c-8f47-3d8986219793-kube-api-access-dhsrc\") pod \"multus-additional-cni-plugins-h5t2l\" (UID: \"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\") " pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.403653 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.494415 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8svls" Oct 01 13:37:32 crc kubenswrapper[4774]: W1001 13:37:32.506620 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe8a0f8f_0098_4fa6_b4b2_ceda580f19b5.slice/crio-0de9dd3081a2adbdd58d511368422445f9227e2ca324ccffffe2b5b0b178b1e3 WatchSource:0}: Error finding container 0de9dd3081a2adbdd58d511368422445f9227e2ca324ccffffe2b5b0b178b1e3: Status 404 returned error can't find the container with id 0de9dd3081a2adbdd58d511368422445f9227e2ca324ccffffe2b5b0b178b1e3 Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.511892 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.542448 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v7jfr"] Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.543363 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.545315 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.545649 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.545654 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.546162 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.546509 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.546548 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.546753 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 13:37:32 crc kubenswrapper[4774]: W1001 13:37:32.547048 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c2cbe4_dd67_4f5c_8f47_3d8986219793.slice/crio-ca996f330732db6541387651aa1f9e3c8bd24309596e572288126e7d4bc41fbd WatchSource:0}: Error finding container ca996f330732db6541387651aa1f9e3c8bd24309596e572288126e7d4bc41fbd: Status 404 returned error can't find the container with id ca996f330732db6541387651aa1f9e3c8bd24309596e572288126e7d4bc41fbd Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.557580 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir
\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.570396 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.584517 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.605084 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.629425 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.646177 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.658643 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674767 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-script-lib\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc 
kubenswrapper[4774]: I1001 13:37:32.674804 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-log-socket\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674837 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-slash\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674851 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-etc-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674865 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674880 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-config\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-env-overrides\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.674993 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovn-node-metrics-cert\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675042 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-systemd\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675062 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-netd\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675105 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-kubelet\") pod \"ovnkube-node-v7jfr\" (UID: 
\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675125 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-netns\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675174 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-var-lib-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675189 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-bin\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675230 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-systemd-units\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7th\" (UniqueName: \"kubernetes.io/projected/e3ee3cb3-6187-468f-9b58-60a18ef2da67-kube-api-access-8t7th\") 
pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675310 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-ovn-kubernetes\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675329 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-ovn\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675366 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-node-log\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675384 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.675148 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.689068 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.700568 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.711537 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.722421 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.737873 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776766 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-ovn\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776811 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-node-log\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776832 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776849 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-script-lib\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776865 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-log-socket\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776887 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-etc-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776907 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-slash\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776921 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776937 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-config\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776953 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-env-overrides\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776969 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovn-node-metrics-cert\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776985 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-systemd\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776990 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-node-log\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777008 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-netd\") pod 
\"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777028 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-kubelet\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777043 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-netns\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-var-lib-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777057 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-bin\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777092 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-systemd-units\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777110 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t7th\" (UniqueName: \"kubernetes.io/projected/e3ee3cb3-6187-468f-9b58-60a18ef2da67-kube-api-access-8t7th\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777133 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-ovn-kubernetes\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777170 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-netd\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777200 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-kubelet\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc 
kubenswrapper[4774]: I1001 13:37:32.777180 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-ovn-kubernetes\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777235 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-netns\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777256 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-var-lib-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777279 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-bin\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777301 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-systemd-units\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777542 4774 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777570 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-slash\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777573 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-etc-openvswitch\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776928 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-ovn\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.776969 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-log-socket\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777715 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-systemd\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.777922 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-script-lib\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.778050 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-config\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.778707 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-env-overrides\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.780377 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovn-node-metrics-cert\") pod \"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.796601 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t7th\" (UniqueName: \"kubernetes.io/projected/e3ee3cb3-6187-468f-9b58-60a18ef2da67-kube-api-access-8t7th\") pod 
\"ovnkube-node-v7jfr\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.865223 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:32 crc kubenswrapper[4774]: I1001 13:37:32.869963 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:32 crc kubenswrapper[4774]: E1001 13:37:32.870070 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:32 crc kubenswrapper[4774]: W1001 13:37:32.877408 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ee3cb3_6187_468f_9b58_60a18ef2da67.slice/crio-a3b86086d7245e27b984358968c2142debea2cd6b6c0209b0196de96dca863d0 WatchSource:0}: Error finding container a3b86086d7245e27b984358968c2142debea2cd6b6c0209b0196de96dca863d0: Status 404 returned error can't find the container with id a3b86086d7245e27b984358968c2142debea2cd6b6c0209b0196de96dca863d0 Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.021853 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerStarted","Data":"138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.021898 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerStarted","Data":"ca996f330732db6541387651aa1f9e3c8bd24309596e572288126e7d4bc41fbd"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.023496 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerStarted","Data":"2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.023597 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerStarted","Data":"0de9dd3081a2adbdd58d511368422445f9227e2ca324ccffffe2b5b0b178b1e3"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.025263 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.026681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.026750 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"a3b86086d7245e27b984358968c2142debea2cd6b6c0209b0196de96dca863d0"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.028286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" 
event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.028339 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.028364 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"67b3ce19736f25d4336d788dfd046ad433c140b2e5732266207107c1054b9cfe"} Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.036516 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.050301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.065225 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.083229 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.101168 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.115807 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.136348 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.148812 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.162281 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.177283 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.194943 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.206309 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.222921 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.240047 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.260213 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.281495 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.313531 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.352626 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.374386 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.389242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.403080 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.413206 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.423615 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.444493 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.456591 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.466828 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.585858 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.585943 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.585976 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.586005 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.586030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586117 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586130 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586154 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586168 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586167 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586214 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586222 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586231 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586180 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:37.586163454 +0000 UTC m=+29.475794051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586619 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:37.586589216 +0000 UTC m=+29.476219873 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586647 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:37:37.586636938 +0000 UTC m=+29.476267545 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586668 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:37.586661308 +0000 UTC m=+29.476291915 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.586686 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:37.586679999 +0000 UTC m=+29.476310606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.869663 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:33 crc kubenswrapper[4774]: I1001 13:37:33.869779 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.869854 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:33 crc kubenswrapper[4774]: E1001 13:37:33.869983 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.032974 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" exitCode=0 Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.033040 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.035946 4774 generic.go:334] "Generic (PLEG): container finished" podID="f6c2cbe4-dd67-4f5c-8f47-3d8986219793" containerID="138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece" exitCode=0 Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.036033 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerDied","Data":"138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece"} Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.050590 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.066840 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.098070 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.121185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.135284 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.146134 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.162623 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ent
rypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.175439 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.186720 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.199925 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.211216 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752d
fd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.225312 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.236022 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.249942 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.261348 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.273160 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.287198 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.304242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.323850 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.336394 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.350403 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.366210 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.380409 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.392339 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.410431 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.446173 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:34 crc kubenswrapper[4774]: I1001 13:37:34.869712 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:34 crc kubenswrapper[4774]: E1001 13:37:34.870137 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.046066 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.046170 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.046192 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.048320 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" 
event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerStarted","Data":"bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf"} Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.064813 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.079310 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.092788 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.107902 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.120379 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.136003 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.150730 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.163359 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.178801 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.193740 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.222189 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.235780 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.247490 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:35Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.869514 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:35 crc kubenswrapper[4774]: I1001 13:37:35.869569 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:35 crc kubenswrapper[4774]: E1001 13:37:35.870000 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:35 crc kubenswrapper[4774]: E1001 13:37:35.870125 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.030550 4774 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.032470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.032524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.032542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.032690 4774 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.041633 4774 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.042049 4774 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.043685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.043736 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.043752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.043774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.043793 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.054243 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.054300 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.054317 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.069548 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch 
status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.074474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.074513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.074523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.074541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.074554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.089145 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.092493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.092530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.092542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.092559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.092570 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.114936 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.118286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.118318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.118327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.118341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.118350 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.137658 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.141838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.141869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.141882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.141898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.141908 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.161866 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.162092 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.163361 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.163388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.163396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.163410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.163421 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.266244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.266483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.266647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.266791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.266933 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.370120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.370172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.370187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.370208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.370225 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.472938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.472998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.473015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.473040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.473059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.576390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.576492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.576540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.576564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.576581 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.679848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.679922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.679942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.679989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.680011 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.783025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.783091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.783109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.783133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.783152 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.869772 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:36 crc kubenswrapper[4774]: E1001 13:37:36.870012 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.885772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.885840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.885860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.885881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.885900 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.988158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.988240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.988286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.988306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:36 crc kubenswrapper[4774]: I1001 13:37:36.988318 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:36Z","lastTransitionTime":"2025-10-01T13:37:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.060847 4774 generic.go:334] "Generic (PLEG): container finished" podID="f6c2cbe4-dd67-4f5c-8f47-3d8986219793" containerID="bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf" exitCode=0 Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.060913 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerDied","Data":"bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.082994 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.092504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.092762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.092786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.092814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.092834 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.100045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.124945 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.141514 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.155814 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea
32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.168185 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.180494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.193422 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.194844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.194878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.194890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.194909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.194924 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.209634 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.228007 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.247353 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.261307 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.275428 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.297373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.297798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.297817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.297838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.297853 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.399348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.399392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.399409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.399430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.399473 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.502528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.502588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.502605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.502627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.502641 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.605854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.605917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.605934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.605958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.605978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.634549 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.634733 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.634783 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:37:45.634743236 +0000 UTC m=+37.524373863 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.634847 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.634925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.634950 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.634999 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635013 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635034 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635120 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:45.635093537 +0000 UTC m=+37.524724164 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635173 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635202 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635227 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635296 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:45.635276732 +0000 UTC m=+37.524907469 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635383 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635423 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:37:45.635411006 +0000 UTC m=+37.525041643 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635521 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.635565 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 13:37:45.63555362 +0000 UTC m=+37.525184257 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.709941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.710007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.710026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.710054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.710073 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.813174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.813217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.813234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.813258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.813276 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.870313 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.870329 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.870507 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:37 crc kubenswrapper[4774]: E1001 13:37:37.870678 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.916544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.916604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.916628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.916658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:37 crc kubenswrapper[4774]: I1001 13:37:37.916681 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:37Z","lastTransitionTime":"2025-10-01T13:37:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.020208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.020267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.020284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.020308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.020326 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.072107 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.076184 4774 generic.go:334] "Generic (PLEG): container finished" podID="f6c2cbe4-dd67-4f5c-8f47-3d8986219793" containerID="5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53" exitCode=0 Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.076229 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerDied","Data":"5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.105215 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.124234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.124289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.124306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.124330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.124350 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.132632 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.155727 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.170366 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.186432 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.198620 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.210335 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.224919 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.226722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.226753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.226764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.226779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.226791 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.240001 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.259042 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.273340 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.283537 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.294364 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.329244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.329271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.329278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.329290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.329299 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.432020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.432060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.432071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.432085 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.432096 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.535546 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.535609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.535627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.535651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.535669 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.638381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.638423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.638435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.638473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.638485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.741892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.741959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.741977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.742004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.742023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.845020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.845079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.845096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.845118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.845136 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.869610 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:38 crc kubenswrapper[4774]: E1001 13:37:38.869777 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.887896 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.912884 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.934421 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.948216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.948287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.948312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:38 crc 
kubenswrapper[4774]: I1001 13:37:38.948342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.948364 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:38Z","lastTransitionTime":"2025-10-01T13:37:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.949993 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.974713 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:38 crc kubenswrapper[4774]: I1001 13:37:38.997741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.028052 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.048667 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.057295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.057352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.057364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.057382 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.057395 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.071103 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.083103 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerStarted","Data":"d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.090007 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 
13:37:39.103882 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.120436 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.133692 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.159769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.159825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.159861 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.159882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.159896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.262713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.262856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.262918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.262984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.263056 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.366434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.366528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.366556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.366587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.366611 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.469271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.469656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.469855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.470076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.470329 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.574010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.574067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.574087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.574109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.574126 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.677001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.677242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.677300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.677371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.677447 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.780061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.780102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.780114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.780151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.780166 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.869692 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:39 crc kubenswrapper[4774]: E1001 13:37:39.869814 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.869889 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:39 crc kubenswrapper[4774]: E1001 13:37:39.869958 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.882418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.882533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.882562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.882593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.882616 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.985741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.985793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.985810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.985836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:39 crc kubenswrapper[4774]: I1001 13:37:39.985853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:39Z","lastTransitionTime":"2025-10-01T13:37:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.091093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.091326 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.091350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.091379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.091401 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.092396 4774 generic.go:334] "Generic (PLEG): container finished" podID="f6c2cbe4-dd67-4f5c-8f47-3d8986219793" containerID="d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2" exitCode=0 Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.092512 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerDied","Data":"d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.116066 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.135771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752d
fd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.158790 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.173586 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.186803 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.194770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.194822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.194840 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.194865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.194882 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.204637 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.220159 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.238442 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.256118 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.273845 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.300350 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.316862 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.316914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.316926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.316944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.316955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.339864 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.357794 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:40Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.419694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.419734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.419745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.419761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.419772 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.523720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.523817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.523836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.523862 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.523880 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.628881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.628943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.628961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.628987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.629005 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.733056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.733115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.733131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.733153 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.733169 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.836768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.837077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.837091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.837108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.837121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.871493 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:40 crc kubenswrapper[4774]: E1001 13:37:40.871707 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.940289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.940348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.940364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.940390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:40 crc kubenswrapper[4774]: I1001 13:37:40.940411 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:40Z","lastTransitionTime":"2025-10-01T13:37:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.044052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.045230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.045369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.045645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.045777 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.102825 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.103037 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.103082 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.103243 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.109518 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerStarted","Data":"8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.129812 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.141122 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.142336 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.148393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.148477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.148496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.148520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.148536 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.154701 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.174227 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.191686 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.209805 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.228986 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.247133 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.251123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.251183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.251198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.251224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.251240 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.263127 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.280244 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.294636 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.311365 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.328240 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.349761 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.353272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.353303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.353314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.353332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.353343 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.365097 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dc
b8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.383323 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.400573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.418770 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.442988 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.455960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.456019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.456036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.456056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.456071 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.458385 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.474884 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.493753 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.511083 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.529240 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.545644 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.556928 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.558037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.558072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.558082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.558099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.558111 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.578062 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.660389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.660420 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.660430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.660445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.660480 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.763187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.763222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.763230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.763243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.763254 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.865821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.865876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.865891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.865910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.865924 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.869538 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.869565 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:41 crc kubenswrapper[4774]: E1001 13:37:41.869689 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:41 crc kubenswrapper[4774]: E1001 13:37:41.869770 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.968240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.968282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.968293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.968310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:41 crc kubenswrapper[4774]: I1001 13:37:41.968321 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:41Z","lastTransitionTime":"2025-10-01T13:37:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.071680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.071750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.071771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.071795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.071813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.174410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.174518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.174539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.174563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.174580 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.277216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.277265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.277281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.277304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.277320 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.380943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.381049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.381068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.381130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.381150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.484105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.484157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.484169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.484186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.484198 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.586901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.586972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.586992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.587017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.587036 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.689516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.690023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.690042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.690067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.690087 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.794656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.794719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.794737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.794761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.794778 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.871974 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:42 crc kubenswrapper[4774]: E1001 13:37:42.872065 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.897104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.897127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.897135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.897150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:42 crc kubenswrapper[4774]: I1001 13:37:42.897157 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:42Z","lastTransitionTime":"2025-10-01T13:37:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.000052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.000155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.000183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.000666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.000728 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.103700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.103761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.103778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.103803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.103821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.125023 4774 generic.go:334] "Generic (PLEG): container finished" podID="f6c2cbe4-dd67-4f5c-8f47-3d8986219793" containerID="8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe" exitCode=0 Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.125106 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerDied","Data":"8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.153964 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.171680 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.184083 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.198152 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.206362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.206393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.206402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.206418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.206430 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.216589 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.232665 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.245955 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.258759 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.271087 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.289602 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.308875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.308925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.308943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.308966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.308983 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.310616 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.327212 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.344460 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:43Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.411562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.411607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.411619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.411636 4774 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.411648 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.514599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.514643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.514657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.514678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.514693 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.618268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.618334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.618393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.618423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.618443 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.721814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.721867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.721884 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.721908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.721925 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.824750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.824813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.824833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.824858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.824875 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.869503 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.869541 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:43 crc kubenswrapper[4774]: E1001 13:37:43.869697 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:43 crc kubenswrapper[4774]: E1001 13:37:43.869870 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.928324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.928385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.928402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.928425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:43 crc kubenswrapper[4774]: I1001 13:37:43.928441 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:43Z","lastTransitionTime":"2025-10-01T13:37:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.031945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.032007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.032031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.032056 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.032075 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.134360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.134413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.134429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.134484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.134509 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.136505 4774 generic.go:334] "Generic (PLEG): container finished" podID="f6c2cbe4-dd67-4f5c-8f47-3d8986219793" containerID="46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e" exitCode=0 Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.136562 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerDied","Data":"46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.161316 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.183139 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.205419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.232092 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.238129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc 
kubenswrapper[4774]: I1001 13:37:44.238169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.238180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.238208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.238232 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.256774 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.277869 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.295710 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.314645 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.335387 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.335919 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld"] Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.336803 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.339512 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.341048 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.341048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.341099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.341115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.341135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.341149 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.366482 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.380769 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.394991 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.409897 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.426636 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.444255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.444688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.444930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.445185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.445914 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.445572 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.467157 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.480482 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.494139 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.504515 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55a90eeb-9a46-4083-9c5e-4313773da697-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.504558 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55a90eeb-9a46-4083-9c5e-4313773da697-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.504608 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k75h\" (UniqueName: \"kubernetes.io/projected/55a90eeb-9a46-4083-9c5e-4313773da697-kube-api-access-9k75h\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.504630 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55a90eeb-9a46-4083-9c5e-4313773da697-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 
13:37:44.512272 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.526275 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.539281 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.548685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.549018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.549161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.549319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.549481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.556256 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.569344 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.589811 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.605959 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.606286 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55a90eeb-9a46-4083-9c5e-4313773da697-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.606308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55a90eeb-9a46-4083-9c5e-4313773da697-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 
13:37:44.606353 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k75h\" (UniqueName: \"kubernetes.io/projected/55a90eeb-9a46-4083-9c5e-4313773da697-kube-api-access-9k75h\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.606368 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55a90eeb-9a46-4083-9c5e-4313773da697-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.607199 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55a90eeb-9a46-4083-9c5e-4313773da697-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.607479 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55a90eeb-9a46-4083-9c5e-4313773da697-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.614041 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55a90eeb-9a46-4083-9c5e-4313773da697-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: 
\"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.626773 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.635550 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k75h\" (UniqueName: \"kubernetes.io/projected/55a90eeb-9a46-4083-9c5e-4313773da697-kube-api-access-9k75h\") pod \"ovnkube-control-plane-749d76644c-hvkld\" (UID: \"55a90eeb-9a46-4083-9c5e-4313773da697\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.639315 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:44Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.651024 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.651972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.651991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.651998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.652010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.652019 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.754844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.754897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.754913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.754936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.754954 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.857395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.857421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.857429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.857441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.857467 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.869784 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:44 crc kubenswrapper[4774]: E1001 13:37:44.869950 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.961590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.961666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.961690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.961718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:44 crc kubenswrapper[4774]: I1001 13:37:44.961740 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:44Z","lastTransitionTime":"2025-10-01T13:37:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.067352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.067410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.067427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.067487 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.067524 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.141498 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" event={"ID":"55a90eeb-9a46-4083-9c5e-4313773da697","Type":"ContainerStarted","Data":"5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.141548 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" event={"ID":"55a90eeb-9a46-4083-9c5e-4313773da697","Type":"ContainerStarted","Data":"39668e9004fdd8606221873ecf57e7d09cd49661d227987c86483f9bae751503"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.145379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" event={"ID":"f6c2cbe4-dd67-4f5c-8f47-3d8986219793","Type":"ContainerStarted","Data":"8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.163544 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.170687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.170730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.170740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.170755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.170847 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.175098 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.187918 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.200795 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.212203 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.224502 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.237862 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.254487 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.270673 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.273526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.273560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.273570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.273585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.273594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.285095 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.296286 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.321912 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.339319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.356510 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.376494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.376544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.376556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.376575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.376586 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.459774 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-96g6w"] Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.460118 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.462484 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.462517 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.463138 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.463644 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.473605 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.478008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.478051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.478062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.478080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.478090 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.485579 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.499236 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.511611 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.530063 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.546794 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
1T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.560296 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.577351 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.581641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.581835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.581946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.582074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.582178 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.591283 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.604930 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.615971 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.618846 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/843edfe0-a47c-4ef9-9ec3-938d1605d348-host\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.618904 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/843edfe0-a47c-4ef9-9ec3-938d1605d348-serviceca\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.618946 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m555q\" (UniqueName: \"kubernetes.io/projected/843edfe0-a47c-4ef9-9ec3-938d1605d348-kube-api-access-m555q\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.629246 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.640401 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.648704 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.661238 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.685320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.685364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.685372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.685384 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.685393 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720256 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720385 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 13:38:01.720367636 +0000 UTC m=+53.609998233 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720522 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720613 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720630 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720688 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720700 4774 projected.go:194] Error preparing 
data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720729 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/843edfe0-a47c-4ef9-9ec3-938d1605d348-host\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720672 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/843edfe0-a47c-4ef9-9ec3-938d1605d348-host\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720732 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:01.720723306 +0000 UTC m=+53.610353903 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720667 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720888 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720914 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720933 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.720948 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/843edfe0-a47c-4ef9-9ec3-938d1605d348-serviceca\") pod \"node-ca-96g6w\" 
(UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.720983 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:01.720966363 +0000 UTC m=+53.610596990 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.721010 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.721036 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.721050 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m555q\" (UniqueName: \"kubernetes.io/projected/843edfe0-a47c-4ef9-9ec3-938d1605d348-kube-api-access-m555q\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 
13:37:45.721082 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:01.721059946 +0000 UTC m=+53.610690543 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.721121 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.721148 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:01.721141648 +0000 UTC m=+53.610772246 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.722372 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/843edfe0-a47c-4ef9-9ec3-938d1605d348-serviceca\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.739171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m555q\" (UniqueName: \"kubernetes.io/projected/843edfe0-a47c-4ef9-9ec3-938d1605d348-kube-api-access-m555q\") pod \"node-ca-96g6w\" (UID: \"843edfe0-a47c-4ef9-9ec3-938d1605d348\") " pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.771103 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-96g6w" Oct 01 13:37:45 crc kubenswrapper[4774]: W1001 13:37:45.784031 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod843edfe0_a47c_4ef9_9ec3_938d1605d348.slice/crio-d9babc51c00c4f77614427baa80357f78d80ead11515cf1a0cee040e903964b9 WatchSource:0}: Error finding container d9babc51c00c4f77614427baa80357f78d80ead11515cf1a0cee040e903964b9: Status 404 returned error can't find the container with id d9babc51c00c4f77614427baa80357f78d80ead11515cf1a0cee040e903964b9 Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.788059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.788105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.788120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.788140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.788156 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.825144 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-hgfsz"] Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.825674 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.825741 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.845855 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.857662 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.869923 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.870024 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.870084 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:45 crc kubenswrapper[4774]: E1001 13:37:45.870270 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.872090 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.886635 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.890051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc 
kubenswrapper[4774]: I1001 13:37:45.890080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.890088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.890101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.890109 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.907895 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.922087 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.922842 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l88m2\" (UniqueName: \"kubernetes.io/projected/67555194-dc73-4f0a-bd6e-1ae0a010067a-kube-api-access-l88m2\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.922995 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.935202 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.949861 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.963334 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.979218 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.991006 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:45Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.996475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.996512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.996524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.996542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:45 crc kubenswrapper[4774]: I1001 13:37:45.996554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:45Z","lastTransitionTime":"2025-10-01T13:37:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.008534 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc 
kubenswrapper[4774]: I1001 13:37:46.025399 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.025651 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l88m2\" (UniqueName: \"kubernetes.io/projected/67555194-dc73-4f0a-bd6e-1ae0a010067a-kube-api-access-l88m2\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.026038 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.026134 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:37:46.526121948 +0000 UTC m=+38.415752545 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.040889 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.067416 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.078715 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l88m2\" (UniqueName: \"kubernetes.io/projected/67555194-dc73-4f0a-bd6e-1ae0a010067a-kube-api-access-l88m2\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.085118 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.095319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.099124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.099324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.099404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.099504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.099582 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.148809 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-96g6w" event={"ID":"843edfe0-a47c-4ef9-9ec3-938d1605d348","Type":"ContainerStarted","Data":"d9babc51c00c4f77614427baa80357f78d80ead11515cf1a0cee040e903964b9"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.152047 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" event={"ID":"55a90eeb-9a46-4083-9c5e-4313773da697","Type":"ContainerStarted","Data":"49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.165358 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.176815 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.190187 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.202177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.202221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.202235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.202290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.202303 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.202300 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.212544 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.222096 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.235011 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.254946 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.267267 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.281169 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.294569 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.304916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.304941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.304949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.304961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.304969 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.308419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.318328 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.332090 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.345032 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.361570 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.407445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.407502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.407512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.407528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.407539 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.510864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.511110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.511238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.511360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.511466 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.530677 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.530849 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.530943 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:37:47.530919578 +0000 UTC m=+39.420550215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.547800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.547848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.547917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.547971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.547984 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.560488 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.564522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.564591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.564611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.564636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.564654 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.579616 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.583654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.583700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.583712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.583729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.583742 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.597327 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.601984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.602032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.602050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.602070 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.602088 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.618093 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.623203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.623388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.623551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.623729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.623875 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.643349 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:46Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.643549 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.645653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.645677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.645685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.645699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.645710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.748359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.748427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.748483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.748518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.748540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.851020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.851077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.851093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.851116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.851133 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.870694 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:46 crc kubenswrapper[4774]: E1001 13:37:46.870928 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.955059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.955139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.955160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.955193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:46 crc kubenswrapper[4774]: I1001 13:37:46.955216 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:46Z","lastTransitionTime":"2025-10-01T13:37:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.059327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.059406 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.059423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.059491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.059516 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.162602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.162663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.162680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.162707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.162729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.164776 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-96g6w" event={"ID":"843edfe0-a47c-4ef9-9ec3-938d1605d348","Type":"ContainerStarted","Data":"8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.167675 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/0.log" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.173112 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44" exitCode=1 Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.173133 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.174428 4774 scope.go:117] "RemoveContainer" containerID="885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.196978 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.221860 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.243923 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.265315 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.267499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.267555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.267577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.267605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.267627 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.282844 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:
37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.291845 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.302739 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.314208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.326742 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.337132 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.347277 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.355437 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc 
kubenswrapper[4774]: I1001 13:37:47.366749 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.369986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.370019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.370029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.370045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.370055 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.381999 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.396206 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.407916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.422125 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.433826 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.443803 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.461364 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.472625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.472647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.472655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.472668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.472677 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.473683 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.485080 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.498664 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc 
kubenswrapper[4774]: I1001 13:37:47.524096 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.538138 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.542094 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:47 crc kubenswrapper[4774]: E1001 13:37:47.542300 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:47 crc kubenswrapper[4774]: E1001 13:37:47.542395 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:37:49.542373944 +0000 UTC m=+41.432004561 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.550600 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.565158 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752d
fd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.575567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.575608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.575622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc 
kubenswrapper[4774]: I1001 13:37:47.575640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.575651 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.585930 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"eflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121151 5985 reflector.go:311] Stopping 
reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121312 5985 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.121718 5985 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.122014 5985 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122118 5985 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122228 5985 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122685 5985 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.601039 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.621045 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.636340 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.658311 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:47Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.678734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc 
kubenswrapper[4774]: I1001 13:37:47.678798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.679259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.679314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.679339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.781208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.781240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.781247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.781260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.781269 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.870445 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.870496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.870496 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:47 crc kubenswrapper[4774]: E1001 13:37:47.870635 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:47 crc kubenswrapper[4774]: E1001 13:37:47.871020 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:47 crc kubenswrapper[4774]: E1001 13:37:47.871121 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.882868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.882887 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.882895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.882907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.882915 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.986262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.986294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.986302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.986316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:47 crc kubenswrapper[4774]: I1001 13:37:47.986325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:47Z","lastTransitionTime":"2025-10-01T13:37:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.088658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.088688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.088697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.088709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.088719 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.178360 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/0.log" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.181125 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.181673 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.190602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.190632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.190640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.190655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.190665 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.194866 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.209746 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.224661 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.242091 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.259906 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.274026 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.286712 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc 
kubenswrapper[4774]: I1001 13:37:48.292864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.292914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.292927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.292946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.292958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.307395 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.323836 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.336771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.352953 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.368978 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.390377 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"eflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121151 5985 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121312 5985 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.121718 5985 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.122014 5985 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122118 5985 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122228 5985 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122685 5985 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch
\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.394922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.395048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.395161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.395243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: 
I1001 13:37:48.395335 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.407130 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.425548 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.440270 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.497732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.497785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.497802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.497825 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.497843 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.601123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.601203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.601224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.601254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.601275 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.703905 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.703962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.703978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.704002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.704017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.806501 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.806608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.806633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.806663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.806685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.869811 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:48 crc kubenswrapper[4774]: E1001 13:37:48.869967 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.888587 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-
plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.905349 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.910180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.910243 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.910264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.910292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.910309 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:48Z","lastTransitionTime":"2025-10-01T13:37:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.933041 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.954723 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.974560 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.988257 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:48 crc kubenswrapper[4774]: I1001 13:37:48.998976 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc 
kubenswrapper[4774]: I1001 13:37:49.012116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.012152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.012163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.012176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.012186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.013291 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.023949 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.034816 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.046573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.058743 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.087243 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"eflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121151 5985 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121312 5985 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.121718 5985 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.122014 5985 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122118 5985 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122228 5985 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122685 5985 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch
\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.102748 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.116517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.116591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.116613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.116640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.116661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.116999 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.129577 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.186609 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/1.log" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.187357 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/0.log" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.191026 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086" exitCode=1 Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.191077 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" 
event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.191161 4774 scope.go:117] "RemoveContainer" containerID="885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.192068 4774 scope.go:117] "RemoveContainer" containerID="c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086" Oct 01 13:37:49 crc kubenswrapper[4774]: E1001 13:37:49.192445 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.211774 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752d
fd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.220027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.220064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.220074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc 
kubenswrapper[4774]: I1001 13:37:49.220087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.220097 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.232692 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13
:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based 
request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerI
D\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.249184 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.265859 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.286154 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.305882 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.322739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc 
kubenswrapper[4774]: I1001 13:37:49.322802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.322813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.322833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.322845 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.340399 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://885d2b6aae5c15b71ccdd5fba694b5ad7569797fbd34cee05e21b9dcc8d03d44\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"message\\\":\\\"eflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121151 5985 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:37:46.121312 5985 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.121718 5985 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:37:46.122014 5985 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122118 5985 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122228 5985 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:37:46.122685 5985 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:48Z\\\",\\\"message\\\":\\\" 6270 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:37:48.153683 6270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:37:48.153704 6270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for 
removal\\\\nI1001 13:37:48.153737 6270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:37:48.153766 6270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:37:48.153775 6270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:37:48.153815 6270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:37:48.153828 6270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:37:48.153839 6270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:37:48.153853 6270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:37:48.153858 6270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:37:48.153875 6270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:37:48.153898 6270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:37:48.153901 6270 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:37:48.153935 6270 factory.go:656] Stopping watch factory\\\\nI1001 13:37:48.153950 6270 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49
400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.360175 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.378378 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.396139 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.412263 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.424888 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.425945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.426009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.426026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.426052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.426070 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.443633 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.457680 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.474428 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.485298 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:49 crc 
kubenswrapper[4774]: I1001 13:37:49.528564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.528624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.528639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.528659 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.528671 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.561783 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:49 crc kubenswrapper[4774]: E1001 13:37:49.561952 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:49 crc kubenswrapper[4774]: E1001 13:37:49.562045 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:37:53.562028651 +0000 UTC m=+45.451659238 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.632163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.632221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.632244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.632272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.632294 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.735596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.735652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.735668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.735693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.735709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.838885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.838943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.838963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.838987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.839004 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.870081 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.870177 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.870212 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:49 crc kubenswrapper[4774]: E1001 13:37:49.870370 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:49 crc kubenswrapper[4774]: E1001 13:37:49.870542 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:49 crc kubenswrapper[4774]: E1001 13:37:49.870779 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.942215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.942279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.942297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.942321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:49 crc kubenswrapper[4774]: I1001 13:37:49.942339 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:49Z","lastTransitionTime":"2025-10-01T13:37:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.045116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.045172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.045189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.045212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.045229 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.147966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.148025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.148043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.148068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.148085 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.197289 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/1.log" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.203099 4774 scope.go:117] "RemoveContainer" containerID="c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086" Oct 01 13:37:50 crc kubenswrapper[4774]: E1001 13:37:50.203374 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.221584 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.239590 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.251077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.251135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.251152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 
13:37:50.251179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.251197 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.260535 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.296925 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.319422 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:48Z\\\",\\\"message\\\":\\\" 6270 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:37:48.153683 6270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:37:48.153704 6270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:37:48.153737 6270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 
13:37:48.153766 6270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:37:48.153775 6270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:37:48.153815 6270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:37:48.153828 6270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:37:48.153839 6270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:37:48.153853 6270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:37:48.153858 6270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:37:48.153875 6270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:37:48.153898 6270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:37:48.153901 6270 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:37:48.153935 6270 factory.go:656] Stopping watch factory\\\\nI1001 13:37:48.153950 6270 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.336843 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.350909 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.353296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.353356 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.353376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.353403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.353418 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.371693 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.385489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.398924 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.415885 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
3:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.429831 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.444208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.456195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.456654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.456745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.456834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.456922 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.460665 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.469818 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.488296 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:50Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.559789 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.559820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.559830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.559847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.559858 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.661891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.661936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.661945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.661961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.661971 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.765687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.765752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.765771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.765803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.765823 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.869206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.869271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.869289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.869316 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.869335 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.870120 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:50 crc kubenswrapper[4774]: E1001 13:37:50.870295 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.973474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.973516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.973534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.973558 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:50 crc kubenswrapper[4774]: I1001 13:37:50.973575 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:50Z","lastTransitionTime":"2025-10-01T13:37:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.076819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.076865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.076883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.076906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.076924 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.179792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.179876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.179897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.179927 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.179946 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.283251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.283341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.283365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.283440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.283504 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.386963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.387065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.387094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.387235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.387353 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.493728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.493786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.493799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.493824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.493847 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.596679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.596739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.596763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.596793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.596828 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.698431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.698479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.698505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.698518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.698527 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.801696 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.801755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.801766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.801794 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.801807 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.870149 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.870261 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:51 crc kubenswrapper[4774]: E1001 13:37:51.870287 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:51 crc kubenswrapper[4774]: E1001 13:37:51.870564 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.870799 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:51 crc kubenswrapper[4774]: E1001 13:37:51.870899 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.904009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.904047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.904054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.904071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:51 crc kubenswrapper[4774]: I1001 13:37:51.904081 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:51Z","lastTransitionTime":"2025-10-01T13:37:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.007199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.007264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.007283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.007309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.007329 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.110325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.110378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.110391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.110433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.110448 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.212043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.212069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.212077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.212090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.212099 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.315123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.315156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.315163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.315178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.315187 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.417417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.417481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.417490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.417503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.417515 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.521116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.521152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.521160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.521172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.521181 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.623904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.623951 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.623964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.623982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.623995 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.726514 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.726553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.726565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.726580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.726590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.829067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.829125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.829145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.829171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.829193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.870500 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:52 crc kubenswrapper[4774]: E1001 13:37:52.870697 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.931422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.931507 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.931524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.931544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:52 crc kubenswrapper[4774]: I1001 13:37:52.931558 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:52Z","lastTransitionTime":"2025-10-01T13:37:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.034756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.034799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.034811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.034828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.034838 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.138258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.138318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.138336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.138360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.138379 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.241497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.241542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.241555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.241572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.241584 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.345072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.345142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.345157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.345175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.345186 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.448280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.448313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.448323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.448339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.448348 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.550796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.550831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.550841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.550855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.550865 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.607700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:53 crc kubenswrapper[4774]: E1001 13:37:53.607925 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:53 crc kubenswrapper[4774]: E1001 13:37:53.608046 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:38:01.608018947 +0000 UTC m=+53.497649574 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.653526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.653563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.653574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.653591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.653604 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.756395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.756496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.756519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.756590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.756612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.859703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.859750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.859767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.859791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.859809 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.870154 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.870230 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.870319 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:53 crc kubenswrapper[4774]: E1001 13:37:53.870315 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:53 crc kubenswrapper[4774]: E1001 13:37:53.870407 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:53 crc kubenswrapper[4774]: E1001 13:37:53.870572 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.962400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.962436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.962444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.962476 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:53 crc kubenswrapper[4774]: I1001 13:37:53.962488 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:53Z","lastTransitionTime":"2025-10-01T13:37:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.064937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.064969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.064978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.064992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.065001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.167413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.167461 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.167473 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.167491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.167503 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.270348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.270425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.270442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.270506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.270524 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.372898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.372964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.372976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.372994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.373006 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.474807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.474881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.474897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.474939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.474952 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.578792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.578881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.578903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.578929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.578947 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.682291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.682349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.682369 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.682395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.682414 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.786973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.787036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.787061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.787089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.787111 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.871004 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:54 crc kubenswrapper[4774]: E1001 13:37:54.871270 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.889101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.889158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.889179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.889206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.889223 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.992613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.992678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.992697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.992721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:54 crc kubenswrapper[4774]: I1001 13:37:54.992741 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:54Z","lastTransitionTime":"2025-10-01T13:37:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.095773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.095834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.095852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.095877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.095893 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.198707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.198781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.198798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.198822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.198847 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.301599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.301654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.301670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.301697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.301715 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.403543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.403585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.403593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.403607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.403615 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.505883 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.505912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.505923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.505935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.505944 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.609932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.609998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.610010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.610032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.610049 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.713025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.713104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.713118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.713141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.713154 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.816053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.816142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.816167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.816194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.816211 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.869984 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.870041 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.870012 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:55 crc kubenswrapper[4774]: E1001 13:37:55.870153 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:55 crc kubenswrapper[4774]: E1001 13:37:55.870251 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:55 crc kubenswrapper[4774]: E1001 13:37:55.870477 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.918752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.918781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.918790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.918803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:55 crc kubenswrapper[4774]: I1001 13:37:55.918813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:55Z","lastTransitionTime":"2025-10-01T13:37:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.021994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.022053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.022073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.022099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.022117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.125388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.125425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.125434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.125465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.125474 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.227591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.227818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.227890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.227950 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.228006 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.332078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.332746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.333200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.333478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.333700 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.437562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.437620 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.437637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.437664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.437681 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.551297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.551545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.551569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.551602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.551624 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.654003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.654042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.654052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.654066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.654076 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.677130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.677173 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.677182 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.677194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.677204 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: E1001 13:37:56.691989 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:56Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.695314 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.695338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.695346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.695359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.695368 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: E1001 13:37:56.713085 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:56Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.717537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.717575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.717586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.717605 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.717614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:56Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.756193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.756238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.756254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.756273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.756287 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: E1001 13:37:56.775645 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:56Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:56 crc kubenswrapper[4774]: E1001 13:37:56.775791 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.778772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.778843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.778864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.778901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.778925 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.870533 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:56 crc kubenswrapper[4774]: E1001 13:37:56.870706 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.881716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.881870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.881953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.882029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.882107 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.984716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.984767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.984781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.984799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:56 crc kubenswrapper[4774]: I1001 13:37:56.984813 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:56Z","lastTransitionTime":"2025-10-01T13:37:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.087992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.088098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.088128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.088168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.088193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.191852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.191916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.191934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.191959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.191978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.295295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.295359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.295377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.295404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.295424 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.398941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.399027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.399047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.399073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.399091 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.501834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.501926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.501944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.501968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.501988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.605248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.605307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.605329 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.605358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.605379 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.708872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.708913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.708926 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.708943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.708955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.810815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.810890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.810907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.810929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.810944 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.870070 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.870126 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.870075 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:57 crc kubenswrapper[4774]: E1001 13:37:57.870215 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:57 crc kubenswrapper[4774]: E1001 13:37:57.870370 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:57 crc kubenswrapper[4774]: E1001 13:37:57.870495 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.914132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.914185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.914257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.914289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:57 crc kubenswrapper[4774]: I1001 13:37:57.914379 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:57Z","lastTransitionTime":"2025-10-01T13:37:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.017632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.017687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.017700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.017718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.017730 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.121090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.121303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.121334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.121363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.121388 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.224720 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.224790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.224807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.224834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.224852 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.328149 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.328216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.328237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.328268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.328288 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.431271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.431327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.431348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.431375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.431397 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.534273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.534343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.534366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.534612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.534639 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.637767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.637824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.637847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.637876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.637896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.741368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.741411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.741426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.741471 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.741484 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.844614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.844688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.844708 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.844739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.844762 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.869546 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:37:58 crc kubenswrapper[4774]: E1001 13:37:58.869723 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.898037 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.918790 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.941597 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.947282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.947372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.947390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.947412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.947428 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:58Z","lastTransitionTime":"2025-10-01T13:37:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.962769 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:
37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:58 crc kubenswrapper[4774]: I1001 13:37:58.987849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:48Z\\\",\\\"message\\\":\\\" 6270 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:37:48.153683 6270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:37:48.153704 6270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:37:48.153737 6270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 
13:37:48.153766 6270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:37:48.153775 6270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:37:48.153815 6270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:37:48.153828 6270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:37:48.153839 6270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:37:48.153853 6270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:37:48.153858 6270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:37:48.153875 6270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:37:48.153898 6270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:37:48.153901 6270 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:37:48.153935 6270 factory.go:656] Stopping watch factory\\\\nI1001 13:37:48.153950 6270 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.003345 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.019799 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.037055 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.049004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.049051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.049064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.049091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.049104 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.056063 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.066873 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.077522 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.087404 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.104297 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.116367 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.126030 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.139809 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:37:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.152288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.152336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.152346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.152363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.152377 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.254902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.254967 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.254983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.255008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.255052 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.357300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.357348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.357360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.357376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.357386 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.459793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.459872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.459893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.459917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.459936 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.562815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.562894 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.562912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.562940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.562958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.665604 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.665668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.665681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.665699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.665713 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.768671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.768722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.768740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.768765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.768782 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.870244 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:37:59 crc kubenswrapper[4774]: E1001 13:37:59.870417 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.870576 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:37:59 crc kubenswrapper[4774]: E1001 13:37:59.870694 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.870767 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:37:59 crc kubenswrapper[4774]: E1001 13:37:59.870849 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.872430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.872510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.872528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.872549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.872567 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.975227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.975289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.975307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.975333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:37:59 crc kubenswrapper[4774]: I1001 13:37:59.975353 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:37:59Z","lastTransitionTime":"2025-10-01T13:37:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.078660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.078724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.078740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.078767 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.078783 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.181319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.181389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.181407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.181431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.181479 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.284918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.285021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.285037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.285060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.285077 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.387504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.387599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.387637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.387670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.387691 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.490757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.490822 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.490841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.490868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.490885 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.594327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.594390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.594413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.594437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.594481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.696935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.697001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.697021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.697043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.697057 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.800533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.800593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.800610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.800634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.800650 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.870569 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:00 crc kubenswrapper[4774]: E1001 13:38:00.870786 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.902655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.902693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.902706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.902722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:00 crc kubenswrapper[4774]: I1001 13:38:00.902734 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:00Z","lastTransitionTime":"2025-10-01T13:38:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.005357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.005413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.005429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.005520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.005540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.108190 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.108234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.108245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.108261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.108273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.211555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.211626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.211645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.211672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.211690 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.315787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.315866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.315954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.315998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.316022 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.419198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.419277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.419301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.419339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.419360 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.522641 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.522701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.522725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.522753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.522775 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.626096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.626181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.626211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.626240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.626263 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.690548 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.690781 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.690898 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:38:17.690861735 +0000 UTC m=+69.580492372 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.729581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.729652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.729677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.729730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.729750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.791787 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.791975 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:38:33.791938414 +0000 UTC m=+85.681569041 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.792045 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.792106 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.792158 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.792223 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792270 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792301 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792321 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792386 4774 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792428 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792403 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:33.792382027 +0000 UTC m=+85.682012654 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792551 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:33.792522721 +0000 UTC m=+85.682153388 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792567 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792585 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:33.792565143 +0000 UTC m=+85.682195950 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792594 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792612 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.792671 4774 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:38:33.792656335 +0000 UTC m=+85.682286972 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.832899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.832942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.832958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.832982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.833002 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.869822 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.869900 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.869824 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.869972 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.870066 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:01 crc kubenswrapper[4774]: E1001 13:38:01.870278 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.936562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.936624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.936661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.936732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:01 crc kubenswrapper[4774]: I1001 13:38:01.936755 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:01Z","lastTransitionTime":"2025-10-01T13:38:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.040071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.040165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.040189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.040222 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.040245 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.143383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.143497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.143516 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.143540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.143558 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.246685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.246748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.246768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.246792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.246809 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.350242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.350309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.350332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.350365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.350387 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.453710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.453778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.453800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.453826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.453843 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.557290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.557681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.557964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.558206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.558360 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.661866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.661940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.661962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.661991 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.662017 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.764753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.764811 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.764834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.764860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.764882 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.868256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.868327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.868350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.868378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.868400 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.869825 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:02 crc kubenswrapper[4774]: E1001 13:38:02.870016 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.971321 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.971389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.971410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.971438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:02 crc kubenswrapper[4774]: I1001 13:38:02.971497 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:02Z","lastTransitionTime":"2025-10-01T13:38:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.074179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.074239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.074259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.074286 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.074306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.178071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.178155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.178175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.178201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.178218 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.282829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.282899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.282917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.282941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.282958 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.385977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.386053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.386079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.386110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.386133 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.489874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.489933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.489952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.489979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.489998 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.593447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.594268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.594491 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.594688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.594829 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.697998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.698611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.698633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.698655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.698669 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.801142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.801483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.801596 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.801701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.801786 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.870098 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.870136 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.870144 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:03 crc kubenswrapper[4774]: E1001 13:38:03.870752 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:03 crc kubenswrapper[4774]: E1001 13:38:03.871288 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:03 crc kubenswrapper[4774]: E1001 13:38:03.871423 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.904574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.904655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.904680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.904713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:03 crc kubenswrapper[4774]: I1001 13:38:03.904735 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:03Z","lastTransitionTime":"2025-10-01T13:38:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.007868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.007925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.007940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.007959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.007973 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.110943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.110989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.110999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.111015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.111025 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.213583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.213645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.213662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.213687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.213706 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.316841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.316944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.316963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.317025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.317043 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.419647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.419702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.419719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.419743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.419762 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.523559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.523601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.523610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.523625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.523635 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.627612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.627700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.627726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.627759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.627782 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.731109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.731168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.731186 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.731210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.731229 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.833530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.833587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.833603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.833627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.833644 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.869992 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:04 crc kubenswrapper[4774]: E1001 13:38:04.870179 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.870977 4774 scope.go:117] "RemoveContainer" containerID="c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.937143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.937497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.937515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.937540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:04 crc kubenswrapper[4774]: I1001 13:38:04.937558 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:04Z","lastTransitionTime":"2025-10-01T13:38:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.040412 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.040517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.040538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.040568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.040590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.144512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.144556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.144578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.144607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.144627 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.248009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.248065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.248083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.248105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.248121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.259959 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/1.log" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.265759 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.266211 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.289657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.305920 4774 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.332771 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.351288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.351348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.351371 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.351399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.351422 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.357232 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:
37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.390621 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:48Z\\\",\\\"message\\\":\\\" 6270 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:37:48.153683 6270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:37:48.153704 6270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:37:48.153737 6270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 
13:37:48.153766 6270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:37:48.153775 6270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:37:48.153815 6270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:37:48.153828 6270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:37:48.153839 6270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:37:48.153853 6270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:37:48.153858 6270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:37:48.153875 6270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:37:48.153898 6270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:37:48.153901 6270 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:37:48.153935 6270 factory.go:656] Stopping watch factory\\\\nI1001 13:37:48.153950 6270 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.411990 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.432562 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.453716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.453780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.453796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.453823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.453864 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.459189 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.475158 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.486542 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.502180 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T1
3:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.514639 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.528319 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.543024 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.554167 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.556381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.556424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.556440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.556478 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.556499 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.570169 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.658898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.658944 4774 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.658957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.658973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.658984 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.762304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.762372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.762388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.762414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.762432 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.864992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.865058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.865079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.865104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.865121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.869492 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.869569 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.869594 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:05 crc kubenswrapper[4774]: E1001 13:38:05.869717 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:05 crc kubenswrapper[4774]: E1001 13:38:05.869837 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:05 crc kubenswrapper[4774]: E1001 13:38:05.870009 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.968045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.968108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.968119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.968134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:05 crc kubenswrapper[4774]: I1001 13:38:05.968146 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:05Z","lastTransitionTime":"2025-10-01T13:38:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.071801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.071868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.071890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.071918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.071939 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.142612 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.160919 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.161913 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nod
e-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.175625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.175710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.175733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.175765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.175785 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.183914 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.199581 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.223612 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.242131 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.260925 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.273546 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/2.log" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.275069 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/1.log" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.277549 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.278178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.278217 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.278236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.278300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.278318 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.280092 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529" exitCode=1 Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.280310 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.280675 4774 scope.go:117] "RemoveContainer" containerID="c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.285156 4774 scope.go:117] "RemoveContainer" containerID="79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529" Oct 01 13:38:06 crc kubenswrapper[4774]: E1001 13:38:06.287047 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.306522 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.323798 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.335961 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.354275 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.377395 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:48Z\\\",\\\"message\\\":\\\" 6270 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:37:48.153683 6270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:37:48.153704 6270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:37:48.153737 6270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 
13:37:48.153766 6270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:37:48.153775 6270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:37:48.153815 6270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:37:48.153828 6270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:37:48.153839 6270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:37:48.153853 6270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:37:48.153858 6270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:37:48.153875 6270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:37:48.153898 6270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:37:48.153901 6270 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:37:48.153935 6270 factory.go:656] Stopping watch factory\\\\nI1001 13:37:48.153950 6270 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.381992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.382031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.382044 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.382063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.382078 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.396222 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.415722 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.437545 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.457934 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.477507 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.485613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.485657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.485674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.485697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.485715 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.497981 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.514866 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc 
kubenswrapper[4774]: I1001 13:38:06.537603 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.558294 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.575020 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.588919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.589170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.589315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.589518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.589669 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.593707 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.613558 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.633013 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.653786 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.674301 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.693075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc 
kubenswrapper[4774]: I1001 13:38:06.693148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.693169 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.693197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.693219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.708500 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8353a8fd57376dac606e1078b78d3359674d6554bc958fdf66d8a9f0542f086\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:37:48Z\\\",\\\"message\\\":\\\" 6270 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:37:48.153683 6270 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:37:48.153704 6270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:37:48.153737 6270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:37:48.153766 6270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:37:48.153775 6270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:37:48.153815 6270 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:37:48.153828 6270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:37:48.153839 6270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:37:48.153853 6270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:37:48.153858 6270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:37:48.153875 6270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:37:48.153898 6270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:37:48.153901 6270 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:37:48.153935 6270 factory.go:656] Stopping watch factory\\\\nI1001 13:37:48.153950 6270 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.733850 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.755348 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.782091 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.796157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.796215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.796235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.796260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.796278 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.804155 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.822061 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.870024 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:06 crc kubenswrapper[4774]: E1001 13:38:06.870211 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.899539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.899594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.899608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.899628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:06 crc kubenswrapper[4774]: I1001 13:38:06.899642 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:06Z","lastTransitionTime":"2025-10-01T13:38:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.002975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.003033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.003051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.003075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.003092 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.106366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.106771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.106917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.107073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.107222 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.155715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.156254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.156496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.156769 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.156978 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.186496 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.191995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.192032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.192045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.192061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.192074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.213108 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.217981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.218031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.218048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.218071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.218088 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.236256 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.240280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.240354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.240373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.240399 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.240416 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.262342 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.267895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.267940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.267961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.267985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.268002 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.287073 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/2.log" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.294275 4774 scope.go:117] "RemoveContainer" containerID="79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529" Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.296922 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.297805 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.298090 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.299901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.299935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.299947 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.299964 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.299977 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.313067 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.326134 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.340242 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.358424 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.379575 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.400038 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.401873 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.401920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.401943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.401971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.401991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.419834 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.434677 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.464752 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.481544 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.505085 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.505373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.505657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.505871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.506223 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.505503 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.530719 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.550586 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.566488 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.585566 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35
512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.602768 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc 
kubenswrapper[4774]: I1001 13:38:07.609080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.609109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.609120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.609135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.609145 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.622567 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce
5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.711617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.711680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.711699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.711723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.711739 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.814132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.814178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.814195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.814219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.814236 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.869759 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.869801 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.869946 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.870110 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.870358 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:07 crc kubenswrapper[4774]: E1001 13:38:07.870808 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.918160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.919019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.919262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.919505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:07 crc kubenswrapper[4774]: I1001 13:38:07.919692 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:07Z","lastTransitionTime":"2025-10-01T13:38:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.023193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.023701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.023861 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.024025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.024160 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.128995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.129508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.129689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.129978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.130173 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.232714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.232750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.232761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.232777 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.232788 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.336721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.337109 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.337368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.337614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.337806 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.441324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.441730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.441860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.441966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.442067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.544997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.545068 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.545089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.545120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.545143 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.648589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.648996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.649155 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.649397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.649626 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.753252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.753326 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.753344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.753368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.753389 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.856434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.856580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.856616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.856642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.856659 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.869640 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:08 crc kubenswrapper[4774]: E1001 13:38:08.869826 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.894919 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.912962 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.928937 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.947591 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.959220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.959427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.959554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.959646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.959722 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:08Z","lastTransitionTime":"2025-10-01T13:38:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.970600 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dc
b8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:08 crc kubenswrapper[4774]: I1001 13:38:08.994958 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.016405 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.039419 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.063267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc 
kubenswrapper[4774]: I1001 13:38:09.063366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.063401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.063434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.063509 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.075430 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.092392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.108740 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.130211 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.148590 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.165232 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.166753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.166979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.167189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.167364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.167575 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.180441 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce
5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.202679 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.216657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:09 crc 
kubenswrapper[4774]: I1001 13:38:09.270533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.270984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.271248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.271656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.271962 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.375652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.375815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.375844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.375875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.375897 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.479415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.479526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.479550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.479577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.479597 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.583907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.583994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.584018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.584052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.584078 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.687295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.687768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.687785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.687806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.687822 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.791670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.791739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.791756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.791781 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.791801 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.869406 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.869421 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.869546 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:09 crc kubenswrapper[4774]: E1001 13:38:09.869745 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:09 crc kubenswrapper[4774]: E1001 13:38:09.869984 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:09 crc kubenswrapper[4774]: E1001 13:38:09.870053 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.895685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.895755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.895776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.895802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.895821 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.998350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.998505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.998567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.998593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:09 crc kubenswrapper[4774]: I1001 13:38:09.998611 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:09Z","lastTransitionTime":"2025-10-01T13:38:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.102123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.102179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.102197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.102223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.102241 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.205574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.205651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.205675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.205707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.205735 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.309108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.309183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.309212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.309247 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.309273 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.413705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.413797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.413848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.413874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.413890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.517544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.517602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.517619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.517642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.517661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.620442 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.620538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.620555 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.620581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.620597 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.724219 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.724385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.724414 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.724510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.724567 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.827240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.827308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.827325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.827348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.827364 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.870063 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:10 crc kubenswrapper[4774]: E1001 13:38:10.870263 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.930508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.930550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.930562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.930579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:10 crc kubenswrapper[4774]: I1001 13:38:10.930590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:10Z","lastTransitionTime":"2025-10-01T13:38:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.034854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.034899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.034911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.034929 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.034941 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.138508 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.138544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.138556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.138578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.138595 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.241483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.241526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.241540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.241559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.241576 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.344930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.344978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.344995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.345020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.345038 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.448285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.448729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.448881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.449030 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.449168 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.552509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.552571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.552588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.552611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.552629 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.655523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.655583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.655603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.655648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.655729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.759236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.759313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.759337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.759367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.759390 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.873199 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.873224 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.873231 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:11 crc kubenswrapper[4774]: E1001 13:38:11.882064 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:11 crc kubenswrapper[4774]: E1001 13:38:11.873739 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.882279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.882334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.882357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.882391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.882413 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:11 crc kubenswrapper[4774]: E1001 13:38:11.882526 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.985521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.985580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.985600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.985626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:11 crc kubenswrapper[4774]: I1001 13:38:11.985643 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:11Z","lastTransitionTime":"2025-10-01T13:38:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.088435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.088818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.088920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.089019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.089144 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.191600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.191661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.191680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.191704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.191723 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.294115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.294185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.294206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.294260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.294286 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.396934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.397256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.397348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.397470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.397573 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.501281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.501335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.501355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.501379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.501395 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.604270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.604331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.604341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.604358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.604368 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.707229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.707271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.707279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.707293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.707302 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.810671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.810764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.810782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.810806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.810824 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.870481 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:12 crc kubenswrapper[4774]: E1001 13:38:12.870698 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.913496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.914552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.914796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.914961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:12 crc kubenswrapper[4774]: I1001 13:38:12.915090 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:12Z","lastTransitionTime":"2025-10-01T13:38:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.018270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.018331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.018349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.018374 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.018394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.121526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.121602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.121611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.121624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.121634 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.225262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.226092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.226485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.226800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.226940 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.329342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.329379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.329390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.329405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.329417 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.432048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.432099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.432111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.432127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.432139 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.534933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.535102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.535116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.535132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.535144 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.639402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.639463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.639479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.639497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.639510 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.742289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.742359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.742376 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.742401 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.742417 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.845886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.845948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.845966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.845990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.846009 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.869669 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.869729 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.869744 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:13 crc kubenswrapper[4774]: E1001 13:38:13.869961 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:13 crc kubenswrapper[4774]: E1001 13:38:13.870053 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:13 crc kubenswrapper[4774]: E1001 13:38:13.870177 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.949364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.949515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.949542 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.949571 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:13 crc kubenswrapper[4774]: I1001 13:38:13.949594 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:13Z","lastTransitionTime":"2025-10-01T13:38:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.051963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.052032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.052049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.052073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.052091 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.154951 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.154990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.155001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.155018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.155031 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.257515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.257567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.257582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.257600 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.257612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.359594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.359651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.359662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.359675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.359685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.462156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.462209 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.462227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.462251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.462267 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.564033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.564061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.564069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.564081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.564090 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.666048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.666092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.666103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.666118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.666128 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.768954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.769026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.769049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.769079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.769103 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.870580 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:14 crc kubenswrapper[4774]: E1001 13:38:14.871269 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.872391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.872439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.872479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.872499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.872511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.974895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.974958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.974976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.975001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:14 crc kubenswrapper[4774]: I1001 13:38:14.975018 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:14Z","lastTransitionTime":"2025-10-01T13:38:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.078200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.078239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.078259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.078275 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.078286 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.181151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.181215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.181233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.181260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.181279 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.283635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.283701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.283717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.283740 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.283757 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.386672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.386728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.386741 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.386763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.386775 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.489142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.489216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.489238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.489267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.489289 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.591965 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.592012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.592027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.592045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.592060 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.694477 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.694539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.694557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.694585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.694609 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.797349 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.797435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.797489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.797523 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.797545 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.870542 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.870566 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:15 crc kubenswrapper[4774]: E1001 13:38:15.870785 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.870564 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:15 crc kubenswrapper[4774]: E1001 13:38:15.870895 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:15 crc kubenswrapper[4774]: E1001 13:38:15.871072 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.900915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.900982 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.901005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.901032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:15 crc kubenswrapper[4774]: I1001 13:38:15.901054 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:15Z","lastTransitionTime":"2025-10-01T13:38:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.003527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.003561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.003572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.003591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.003603 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.106132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.106194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.106204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.106218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.106226 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.232289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.232331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.232343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.232357 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.232368 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.334332 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.334417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.334431 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.334506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.334523 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.436795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.436842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.436852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.436869 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.436903 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.539241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.539273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.539282 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.539317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.539330 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.642041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.642080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.642091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.642107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.642117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.745385 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.745522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.745608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.745653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.745664 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.848069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.848095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.848103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.848115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.848125 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.869843 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:16 crc kubenswrapper[4774]: E1001 13:38:16.870026 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.879920 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.950771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.950818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.950830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.950845 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:16 crc kubenswrapper[4774]: I1001 13:38:16.950858 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:16Z","lastTransitionTime":"2025-10-01T13:38:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.053637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.053677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.053688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.053702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.053712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.156790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.156836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.156848 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.156866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.156877 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.259383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.259503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.259515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.259531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.259543 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.362607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.362646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.362658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.362674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.362684 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.465515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.465557 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.465568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.465586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.465598 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.540544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.540594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.540607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.540624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.540635 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.553918 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.558885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.558934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.558952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.558972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.558989 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.576483 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.580335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.580366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.580375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.580388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.580398 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.597411 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.600884 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.600913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.600922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.600933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.600941 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.616816 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.619619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.619642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.619649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.619663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.619671 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.631840 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.631950 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.633661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.633679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.633690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.633702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.633710 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.735802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.735826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.735835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.735847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.735856 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.779052 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.779205 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.779295 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:38:49.779274664 +0000 UTC m=+101.668905371 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.838784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.838832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.838844 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.838864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.838876 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.869757 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.869784 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.869765 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.869898 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.870065 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:17 crc kubenswrapper[4774]: E1001 13:38:17.870201 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.941998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.942045 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.942057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.942074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:17 crc kubenswrapper[4774]: I1001 13:38:17.942086 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:17Z","lastTransitionTime":"2025-10-01T13:38:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.044944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.044981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.044989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.045004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.045019 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.147917 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.147977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.147994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.148017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.148034 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.250949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.251009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.251027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.251051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.251067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.353256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.353297 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.353317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.353333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.353345 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.455526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.455570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.455582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.455598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.455624 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.557372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.557409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.557419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.557433 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.557446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.658934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.658973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.658984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.659001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.659012 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.761271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.761311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.761320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.761334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.761342 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.863817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.863855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.863866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.863881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.863892 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.870093 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:18 crc kubenswrapper[4774]: E1001 13:38:18.870239 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.884825 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.903023 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.920178 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.937905 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.953568 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.966065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.966154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.966166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.966183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.966188 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runni
ng\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.966545 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:18Z","lastTransitionTime":"2025-10-01T13:38:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.981040 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:18 crc kubenswrapper[4774]: I1001 13:38:18.995103 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc 
kubenswrapper[4774]: I1001 13:38:19.014420 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.028789 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.039104 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.050591 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.069120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.069215 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.069234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.069259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.069274 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.070282 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.082059 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.093768 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.103871 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.115180 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.127606 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.171552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc 
kubenswrapper[4774]: I1001 13:38:19.171583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.171594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.171610 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.171621 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.273628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.273667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.273678 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.273692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.273703 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.375238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.375271 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.375281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.375295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.375304 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.478366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.478404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.478413 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.478428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.478438 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.581319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.581365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.581379 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.581397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.581408 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.685446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.685520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.685536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.685560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.685578 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.788933 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.788977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.788988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.789004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.789016 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.869581 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.869653 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:19 crc kubenswrapper[4774]: E1001 13:38:19.869723 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:19 crc kubenswrapper[4774]: E1001 13:38:19.869826 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.869921 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:19 crc kubenswrapper[4774]: E1001 13:38:19.870018 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.893709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.893742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.893800 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.893819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.893854 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.996330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.996389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.996411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.996443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:19 crc kubenswrapper[4774]: I1001 13:38:19.996500 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:19Z","lastTransitionTime":"2025-10-01T13:38:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.099474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.099538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.099552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.099567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.099578 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.202407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.202509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.202527 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.202549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.202565 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.304698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.304754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.304768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.304786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.304798 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.407681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.407726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.407739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.407755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.407767 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.510810 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.510850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.510858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.510872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.510881 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.613547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.613593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.613608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.613629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.613643 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.716177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.716233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.716258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.716367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.716394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.819859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.819918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.819940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.819975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.819997 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.870135 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:38:20 crc kubenswrapper[4774]: E1001 13:38:20.870850 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.871330 4774 scope.go:117] "RemoveContainer" containerID="79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529"
Oct 01 13:38:20 crc kubenswrapper[4774]: E1001 13:38:20.871648 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.923772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.923816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.923826 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.923843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:20 crc kubenswrapper[4774]: I1001 13:38:20.923853 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:20Z","lastTransitionTime":"2025-10-01T13:38:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.026967 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.027004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.027015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.027055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.027067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.129702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.129747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.129757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.129776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.129788 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.232627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.232654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.232662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.232676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.232685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.335543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.335606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.335628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.335657 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.335679 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.438167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.438201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.438213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.438227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.438239 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.540668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.540707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.540717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.540731 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.540744 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.643127 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.643160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.643168 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.643181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.643190 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.745681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.745723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.745733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.745749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.745760 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.848076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.848112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.848122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.848139 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.848150 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.870276 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.870299 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:38:21 crc kubenswrapper[4774]: E1001 13:38:21.870391 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.870276 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:38:21 crc kubenswrapper[4774]: E1001 13:38:21.870654 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:38:21 crc kubenswrapper[4774]: E1001 13:38:21.870755 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.950553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.950611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.950631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.950647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:21 crc kubenswrapper[4774]: I1001 13:38:21.950659 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:21Z","lastTransitionTime":"2025-10-01T13:38:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.056254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.056292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.056304 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.056319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.056330 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.158564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.158611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.158623 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.158642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.158654 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.264569 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.264611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.264621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.264639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.264651 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.366922 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.366986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.367003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.367027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.367044 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.469745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.469842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.469851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.469864 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.469875 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.571952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.571998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.572012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.572025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.572034 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.674535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.674679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.674699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.674717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.674729 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.778655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.778697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.778711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.778729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.778743 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.870539 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:38:22 crc kubenswrapper[4774]: E1001 13:38:22.870708 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.881559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.881598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.881611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.881633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.881649 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.983324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.983370 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.983383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.983400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:22 crc kubenswrapper[4774]: I1001 13:38:22.983411 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:22Z","lastTransitionTime":"2025-10-01T13:38:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.086761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.086817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.086833 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.086856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.086876 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.190793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.190851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.190867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.190890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.190907 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.293426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.293537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.293562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.293591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.293614 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.349692 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/0.log" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.349739 4774 generic.go:334] "Generic (PLEG): container finished" podID="be8a0f8f-0098-4fa6-b4b2-ceda580f19b5" containerID="2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155" exitCode=1 Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.349766 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerDied","Data":"2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.350095 4774 scope.go:117] "RemoveContainer" containerID="2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.368175 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.393256 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.396893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.397016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.397040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.397072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.397094 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.415313 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.435720 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.447972 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.459304 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.473776 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.489134 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc 
kubenswrapper[4774]: I1001 13:38:23.500338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.500417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.500441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.500497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.500519 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.509375 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.528619 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.540599 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.551787 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.565378 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.584970 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.602772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.602878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.602935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.602953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 
13:38:23.602978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.603001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.616551 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.635125 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.662283 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.705493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.705526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.705534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.705567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.705576 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.807535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.807578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.807587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.807602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.807612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.869803 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.869915 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:23 crc kubenswrapper[4774]: E1001 13:38:23.869995 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:23 crc kubenswrapper[4774]: E1001 13:38:23.870159 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.870317 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:23 crc kubenswrapper[4774]: E1001 13:38:23.870447 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.910504 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.910550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.910561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.910579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:23 crc kubenswrapper[4774]: I1001 13:38:23.910593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:23Z","lastTransitionTime":"2025-10-01T13:38:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.014052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.014113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.014131 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.014157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.014175 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.117497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.117561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.117578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.117603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.117621 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.220942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.220993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.221006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.221026 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.221039 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.323000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.323028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.323037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.323048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.323057 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.354229 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/0.log" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.354274 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerStarted","Data":"8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.370849 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.389068 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] 
multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"
name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.423569 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.425007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.425090 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.425115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.425147 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.425171 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.432933 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.447166 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.461040 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.483153 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.499965 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.513734 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.527489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.527543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.527556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.527574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.527586 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.533024 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.548869 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.563528 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.579105 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.593434 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc 
kubenswrapper[4774]: I1001 13:38:24.607682 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.621968 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\"
,\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.629387 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.629427 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.629440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.629474 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.629487 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.635208 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.644544 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:24Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.732485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.732511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.732519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.732531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.732541 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.834912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.834994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.835020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.835053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.835076 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.870209 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:24 crc kubenswrapper[4774]: E1001 13:38:24.870390 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.938602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.938645 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.938656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.938673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:24 crc kubenswrapper[4774]: I1001 13:38:24.938685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:24Z","lastTransitionTime":"2025-10-01T13:38:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.041254 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.041337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.041375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.041407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.041434 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.144634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.144702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.144724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.144755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.144777 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.247972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.248050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.248073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.248104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.248127 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.350865 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.350912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.350930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.350949 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.350962 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.453876 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.453936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.453957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.453979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.453997 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.557333 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.557418 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.557517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.557552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.557575 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.660738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.660801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.660819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.660843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.660862 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.764167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.764237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.764258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.764284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.764302 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.867220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.867305 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.867337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.867366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.867388 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.869843 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.869869 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.869869 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:25 crc kubenswrapper[4774]: E1001 13:38:25.869956 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:25 crc kubenswrapper[4774]: E1001 13:38:25.870072 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:25 crc kubenswrapper[4774]: E1001 13:38:25.870212 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.970189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.970230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.970242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.970258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:25 crc kubenswrapper[4774]: I1001 13:38:25.970270 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:25Z","lastTransitionTime":"2025-10-01T13:38:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.073285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.073317 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.073325 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.073338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.073347 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.175819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.175868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.175880 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.175898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.175909 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.280002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.280064 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.280087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.280119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.280141 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.382639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.382716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.382733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.382756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.382774 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.485537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.485593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.485613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.485636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.485653 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.588087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.588145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.588161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.588183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.588199 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.690715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.690771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.690792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.690816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.690834 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.793280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.793315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.793323 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.793338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.793349 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.869892 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:26 crc kubenswrapper[4774]: E1001 13:38:26.870086 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.896063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.896096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.896105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.896118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.896126 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.998589 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.998658 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.998683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.998716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:26 crc kubenswrapper[4774]: I1001 13:38:26.998738 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:26Z","lastTransitionTime":"2025-10-01T13:38:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.102108 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.102170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.102187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.102211 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.102227 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.205858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.205923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.205959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.205999 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.206021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.309614 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.309693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.309717 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.309743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.309762 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.413296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.413344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.413360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.413560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.413777 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.517162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.517231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.517249 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.517273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.517290 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.621221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.621292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.621313 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.621340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.621566 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.725223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.725277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.725293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.725315 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.725332 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.828176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.828583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.828856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.829015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.829171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.857682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.858066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.858234 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.858394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.858582 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.869972 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.870119 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.870337 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.870419 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.870591 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.870661 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.874525 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.879700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.879809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.879834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.879863 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.879884 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.899258 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.904204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.904263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.904280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.904305 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.904325 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.924826 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.929577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.929637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.929654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.929677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.929700 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.954538 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.958992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.959046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.959063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.959085 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.959101 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.976815 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:27Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:27 crc kubenswrapper[4774]: E1001 13:38:27.977134 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.979081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.979144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.979164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.979192 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:27 crc kubenswrapper[4774]: I1001 13:38:27.979213 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:27Z","lastTransitionTime":"2025-10-01T13:38:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.081400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.081445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.081483 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.081502 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.081516 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.183366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.183435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.183496 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.183528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.183589 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.287135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.287191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.287213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.287245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.287267 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.390979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.391069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.391104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.391137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.391159 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.494142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.494205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.494228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.494259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.494283 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.597305 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.597365 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.597382 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.597407 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.597428 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.700074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.700103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.700111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.700125 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.700133 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.802899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.802946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.802955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.802971 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.802985 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.870005 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:28 crc kubenswrapper[4774]: E1001 13:38:28.870245 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.886262 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.905842 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.919522 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.923858 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.923890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.923898 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.923913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.923923 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:28Z","lastTransitionTime":"2025-10-01T13:38:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.938808 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.952041 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.967936 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.981132 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:28 crc kubenswrapper[4774]: I1001 13:38:28.992266 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:28Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc 
kubenswrapper[4774]: I1001 13:38:29.004062 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.017282 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.025627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.025664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.025675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:29 crc 
kubenswrapper[4774]: I1001 13:38:29.025691 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.025699 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.030069 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.043777 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752d
fd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.067010 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.090019 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.109108 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.125112 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.128294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.128335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.128372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 
13:38:29.128410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.128425 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.137735 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.151067 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:29Z is after 2025-08-24T17:21:41Z"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.231492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.231552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.231572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.231595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.231612 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.334279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.334583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.335137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.335341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.335783 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.440051 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.440101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.440116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.440132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.440142 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.543345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.543400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.543417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.543438 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.543474 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.645972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.646034 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.646053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.646080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.646098 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.749391 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.749494 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.749511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.749533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.749553 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.852770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.852838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.852861 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.852889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.852912 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.869987 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.870004 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.870023 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:38:29 crc kubenswrapper[4774]: E1001 13:38:29.870570 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a"
Oct 01 13:38:29 crc kubenswrapper[4774]: E1001 13:38:29.870304 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:38:29 crc kubenswrapper[4774]: E1001 13:38:29.870678 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.955907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.956272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.956417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.956606 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:29 crc kubenswrapper[4774]: I1001 13:38:29.956762 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:29Z","lastTransitionTime":"2025-10-01T13:38:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.059687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.059753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.059771 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.059796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.059814 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.162405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.162470 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.162481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.162518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.162530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.265214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.265301 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.265327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.265341 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.265350 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.368560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.368649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.368666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.368690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.368707 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.471936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.471983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.472000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.472021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.472039 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.574912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.574958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.574974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.574995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.575011 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.678603 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.678703 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.678723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.678782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.678805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.781503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.781563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.781585 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.781617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.781636 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.870428 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:38:30 crc kubenswrapper[4774]: E1001 13:38:30.870688 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.883890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.883930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.883942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.883957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.883969 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.985956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.986006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.986014 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.986027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:30 crc kubenswrapper[4774]: I1001 13:38:30.986035 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:30Z","lastTransitionTime":"2025-10-01T13:38:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.089007 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.089069 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.089086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.089110 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.089128 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.192544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.192624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.192648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.192675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.192696 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.295613 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.295702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.295728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.295758 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.295781 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.398058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.398133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.398158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.398188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.398211 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.501599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.501673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.501698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.501730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.501753 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.604835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.604915 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.604935 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.604958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.604974 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.708674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.708749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.708775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.708804 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.708825 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.812220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.812256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.812267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.812351 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.812367 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.869391 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.869435 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.869401 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:31 crc kubenswrapper[4774]: E1001 13:38:31.869570 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:31 crc kubenswrapper[4774]: E1001 13:38:31.869783 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:31 crc kubenswrapper[4774]: E1001 13:38:31.869942 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.914661 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.914733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.914759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.914789 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:31 crc kubenswrapper[4774]: I1001 13:38:31.914812 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:31Z","lastTransitionTime":"2025-10-01T13:38:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.017911 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.017984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.018008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.018037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.018060 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.121083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.121233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.121256 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.121285 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.121304 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.224648 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.224909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.224993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.225124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.225226 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.328479 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.328512 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.328521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.328535 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.328544 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.431578 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.431618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.431630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.431647 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.431659 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.534737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.534817 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.534832 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.534855 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.534871 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.638244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.638294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.638312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.638335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.638353 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.741919 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.741975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.741992 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.742015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.742032 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.844893 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.844972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.844996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.845029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.845050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.870614 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:32 crc kubenswrapper[4774]: E1001 13:38:32.870761 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.947486 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.947531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.947544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.947559 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:32 crc kubenswrapper[4774]: I1001 13:38:32.947568 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:32Z","lastTransitionTime":"2025-10-01T13:38:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.050565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.050638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.050660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.050684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.050704 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.153923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.153984 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.154006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.154035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.154059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.257093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.257172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.257194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.257221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.257241 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.361402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.361505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.361530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.361560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.361582 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.465050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.465130 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.465157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.465183 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.465201 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.567973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.568016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.568028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.568046 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.568059 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.670903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.670957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.670974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.670997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.671019 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.773970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.774031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.774053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.774082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.774103 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.855204 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855318 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 13:39:37.855297371 +0000 UTC m=+149.744927978 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.855361 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.855390 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.855418 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.855478 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855543 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855562 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855574 4774 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855575 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855591 4774 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855600 4774 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855606 4774 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855625 4774 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855628 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:39:37.855614889 +0000 UTC m=+149.745245486 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855698 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:39:37.85567922 +0000 UTC m=+149.745309827 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855725 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:39:37.855716221 +0000 UTC m=+149.745346828 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.855747 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:39:37.855739372 +0000 UTC m=+149.745369979 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.870264 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.870290 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.870385 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.870551 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.870668 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:33 crc kubenswrapper[4774]: E1001 13:38:33.870794 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.875890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.875920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.875930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.876008 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.876021 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.978770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.978818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.978830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.978846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:33 crc kubenswrapper[4774]: I1001 13:38:33.978858 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:33Z","lastTransitionTime":"2025-10-01T13:38:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.081010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.081074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.081093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.081116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.081133 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.182597 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.182652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.182668 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.182689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.182707 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.285956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.286017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.286035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.286054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.286067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.389618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.389688 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.389705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.389729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.389747 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.492462 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.492501 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.492513 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.492528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.492540 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.596032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.596095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.596111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.596135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.596193 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.699899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.699989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.700018 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.700049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.700070 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.803480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.803539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.803565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.803592 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.803609 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.870433 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:34 crc kubenswrapper[4774]: E1001 13:38:34.871261 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.872620 4774 scope.go:117] "RemoveContainer" containerID="79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.893517 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.906727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.906763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.906776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.906792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:34 crc kubenswrapper[4774]: I1001 13:38:34.906804 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:34Z","lastTransitionTime":"2025-10-01T13:38:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.010280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.010335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.010353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.010377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.010394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.120120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.120174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.120191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.120259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.120314 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.224930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.224993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.225011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.225036 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.225054 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.327945 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.328236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.328422 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.328530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.328663 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.431611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.431681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.431697 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.431756 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.431774 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.534838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.535189 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.535346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.535525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.535761 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.639095 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.639172 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.639245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.639284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.639306 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.742762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.743411 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.743747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.744002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.744246 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.847309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.847637 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.847650 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.847673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.847685 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.869826 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.869875 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:35 crc kubenswrapper[4774]: E1001 13:38:35.869974 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.870006 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:35 crc kubenswrapper[4774]: E1001 13:38:35.870087 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:35 crc kubenswrapper[4774]: E1001 13:38:35.870176 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.949990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.950040 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.950058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.950080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:35 crc kubenswrapper[4774]: I1001 13:38:35.950096 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:35Z","lastTransitionTime":"2025-10-01T13:38:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.053813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.053881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.053908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.053937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.053957 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.157762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.157868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.157892 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.157925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.157951 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.261796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.261878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.261906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.261940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.261965 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.365599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.365653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.365667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.365687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.365702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.404528 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/2.log" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.408422 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.409102 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.429750 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.449740 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.463329 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.467888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.467937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.467946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.467960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.467970 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.486249 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.507695 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.518295 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.536559 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.550479 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.570061 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.570995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.571105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.571171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.571214 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.571256 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.589485 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896
789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.603573 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.629102 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caab59aa-3a68-4254-b70c-8dd65e70bb06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09c6caf5ea0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13
:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.645489 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.664366 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.674683 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.674747 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.674766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.674793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.674812 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.683108 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.698509 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.713758 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.727779 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.741096 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:36 crc 
kubenswrapper[4774]: I1001 13:38:36.777259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.777322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.777339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.777367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.777388 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.870741 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:36 crc kubenswrapper[4774]: E1001 13:38:36.871115 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.879725 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.879780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.879799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.879824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.879847 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.982240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.982283 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.982295 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.982310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:36 crc kubenswrapper[4774]: I1001 13:38:36.982321 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:36Z","lastTransitionTime":"2025-10-01T13:38:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.085101 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.085164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.085181 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.085203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.085219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.188640 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.188695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.188712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.188733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.188751 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.291879 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.291930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.291947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.291968 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.291987 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.394821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.394859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.394871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.394885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.394896 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.414859 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/3.log" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.415756 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/2.log" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.419211 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" exitCode=1 Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.419262 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.419314 4774 scope.go:117] "RemoveContainer" containerID="79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.420609 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:38:37 crc kubenswrapper[4774]: E1001 13:38:37.420926 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.453051 4774 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caab59aa-3a68-4254-b70c-8dd65e70bb06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09
c6caf5ea0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe53
42c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.475272 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.494136 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.497766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.497819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.497859 4774 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.497896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.497920 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.518498 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a857
1acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.539551 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.555004 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.571500 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.586494 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.600821 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.600854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.600863 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.600877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.600886 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.601766 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc 
kubenswrapper[4774]: I1001 13:38:37.622635 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.640952 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.654778 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.671235 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.686529 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.700753 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.704757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.704823 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.704852 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.704884 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.704911 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.717552 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.732254 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.752886 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.775767 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79cc06c5dd31a80500a884578b35e51a621a1e9f954d3a425a56a419b9bd7529\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:05Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:05.922612 6457 
handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:05.922668 6457 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:38:05.922678 6457 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:38:05.922699 6457 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:05.922707 6457 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:05.922713 6457 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:05.922719 6457 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:38:05.922727 6457 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:38:05.922746 6457 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:38:05.922766 6457 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:38:05.922761 6457 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:38:05.922773 6457 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:38:05.922782 6457 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:38:05.922872 6457 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1001 13:38:05.922884 6457 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:36Z\\\",\\\"message\\\":\\\"eflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977370 6819 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.977526 6819 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977660 6819 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977851 6819 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.978108 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:36.978138 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:36.978144 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:36.978168 6819 factory.go:656] Stopping watch factory\\\\nI1001 13:38:36.978180 6819 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:38:36.978210 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:36.978215 6819 metrics.go:553] Stopping metrics server at address 
\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e
5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.807328 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.807386 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.807395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.807409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.807420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.869638 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.869662 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:37 crc kubenswrapper[4774]: E1001 13:38:37.869742 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.869788 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:37 crc kubenswrapper[4774]: E1001 13:38:37.869969 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:37 crc kubenswrapper[4774]: E1001 13:38:37.870067 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.910212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.910267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.910291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.910319 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:37 crc kubenswrapper[4774]: I1001 13:38:37.910341 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:37Z","lastTransitionTime":"2025-10-01T13:38:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.012715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.012783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.012808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.012834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.012851 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.115255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.115679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.116053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.116417 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.116776 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.219654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.219712 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.219729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.219753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.219772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.322667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.322732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.322750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.322773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.322790 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.370397 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.370702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.370849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.371006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.371156 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: E1001 13:38:38.393744 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.399303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.399343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.399358 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.399377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.399394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.420981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.421049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.421074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.421102 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.421123 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.424849 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/3.log" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.429741 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:38:38 crc kubenswrapper[4774]: E1001 13:38:38.430079 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.443695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.443729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.443743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.443763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.443777 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.449052 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: E1001 13:38:38.461979 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.465618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.465667 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.465679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.465700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.465715 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.467317 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: E1001 13:38:38.480164 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: E1001 13:38:38.480394 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.481299 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f
416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.483074 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.483137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.483163 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.483194 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.483211 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.494230 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.522325 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:36Z\\\",\\\"message\\\":\\\"eflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977370 6819 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.977526 6819 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977660 6819 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977851 6819 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.978108 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:36.978138 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:36.978144 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:36.978168 6819 factory.go:656] Stopping watch factory\\\\nI1001 13:38:36.978180 6819 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:38:36.978210 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:36.978215 6819 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.538087 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.557836 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.578904 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.586230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.586293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.586320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 
13:38:38.586353 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.586376 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.596081 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.614080 4774 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 
to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-c
erts\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.626989 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.647660 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caab59aa-3a68-4254-b70c-8dd65e70bb06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09c6caf5ea0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13
:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.669259 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.689981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.690307 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.690482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.690636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.690761 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.690855 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.712770 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.730154 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.747741 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284
101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.764784 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.781761 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc 
kubenswrapper[4774]: I1001 13:38:38.797948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.798038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.798076 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.798105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.798128 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.870255 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:38 crc kubenswrapper[4774]: E1001 13:38:38.870492 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.889925 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.901098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.901161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.901180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.901205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.901223 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:38Z","lastTransitionTime":"2025-10-01T13:38:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.903967 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.925441 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.940325 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.956839 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:38 crc kubenswrapper[4774]: I1001 13:38:38.988708 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caab59aa-3a68-4254-b70c-8dd65e70bb06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09c6caf5ea0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}},{\\\"containerID\\
\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:38Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.002624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.002652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 
13:38:39.002662 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.002677 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.002686 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.007493 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3
5512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.021032 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc 
kubenswrapper[4774]: I1001 13:38:39.036070 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.049221 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.059627 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.074555 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.095392 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.105925 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.106224 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.106428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.106655 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.106833 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.111104 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dc
b8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.126802 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.155111 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.196437 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.210162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.210511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.210652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.210788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.210906 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.222772 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:36Z\\\",\\\"message\\\":\\\"eflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977370 6819 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.977526 6819 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977660 6819 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977851 6819 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.978108 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:36.978138 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:36.978144 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:36.978168 6819 factory.go:656] Stopping watch factory\\\\nI1001 13:38:36.978180 6819 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:38:36.978210 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:36.978215 6819 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.238433 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:39Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.313517 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.313552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.313564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.313581 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.313595 4774 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.416318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.416719 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.416846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.417000 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.417115 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.519602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.519904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.520004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.520099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.520201 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.622924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.622986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.623004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.623031 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.623049 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.726240 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.726347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.726368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.726389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.726405 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.829947 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.829994 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.830005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.830023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.830034 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.869797 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.869854 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.869814 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:39 crc kubenswrapper[4774]: E1001 13:38:39.869942 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:39 crc kubenswrapper[4774]: E1001 13:38:39.870084 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:39 crc kubenswrapper[4774]: E1001 13:38:39.870220 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.937150 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.937223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.937273 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.937291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:39 crc kubenswrapper[4774]: I1001 13:38:39.937304 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:39Z","lastTransitionTime":"2025-10-01T13:38:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.040113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.040162 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.040178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.040201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.040219 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.142372 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.142420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.142436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.142463 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.142514 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.245342 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.245443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.245500 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.245532 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.245554 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.347882 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.347943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.347972 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.348002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.348023 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.451481 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.451544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.451563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.451588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.451607 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.555002 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.555061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.555081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.555103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.555120 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.658679 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.658764 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.658793 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.658830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.658854 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.762590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.762682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.762707 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.762746 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.762770 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.866721 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.866799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.866814 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.866842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.866863 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.870173 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:40 crc kubenswrapper[4774]: E1001 13:38:40.870881 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.970748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.970791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.970803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.970818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:40 crc kubenswrapper[4774]: I1001 13:38:40.970829 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:40Z","lastTransitionTime":"2025-10-01T13:38:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.072444 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.072499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.072510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.072530 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.072595 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.174289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.174338 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.174354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.174375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.174390 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.276766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.276831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.276849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.276875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.276894 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.379842 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.379901 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.379918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.379939 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.379951 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.482809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.482890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.482913 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.482938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.482955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.585928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.586165 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.586184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.586228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.586243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.689881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.689952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.689975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.690003 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.690026 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.793482 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.793537 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.793554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.793576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.793593 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.869579 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.869630 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.869644 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:41 crc kubenswrapper[4774]: E1001 13:38:41.869757 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:41 crc kubenswrapper[4774]: E1001 13:38:41.869910 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:41 crc kubenswrapper[4774]: E1001 13:38:41.870071 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.896547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.896602 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.896619 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.896642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.896660 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.999702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.999775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:41 crc kubenswrapper[4774]: I1001 13:38:41.999792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:41.999815 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:41.999832 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:41Z","lastTransitionTime":"2025-10-01T13:38:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.102609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.102656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.102666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.102686 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.102712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.205416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.205556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.205609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.205642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.205661 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.308599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.308654 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.308676 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.308702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.308720 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.411028 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.411124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.411142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.411166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.411188 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.513534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.513701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.513723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.513745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.513760 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.615782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.615828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.615841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.615859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.615871 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.719197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.719251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.719268 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.719292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.719309 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.822266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.822597 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.822671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.822744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.822843 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.870143 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:42 crc kubenswrapper[4774]: E1001 13:38:42.870756 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.925974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.926019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.926032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.926054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:42 crc kubenswrapper[4774]: I1001 13:38:42.926068 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:42Z","lastTransitionTime":"2025-10-01T13:38:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.030093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.030154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.030171 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.030199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.030220 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.133123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.133196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.133212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.133236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.133253 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.235966 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.236322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.236505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.236670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.236808 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.339732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.339790 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.339808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.339909 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.339932 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.443128 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.443184 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.443201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.443225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.443243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.545806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.545863 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.545880 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.545903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.545920 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.649439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.649533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.649551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.649573 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.649590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.752404 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.752524 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.752549 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.752580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.752600 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.855554 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.855611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.855627 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.855651 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.855669 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.869535 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.869572 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.869627 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:43 crc kubenswrapper[4774]: E1001 13:38:43.870234 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:43 crc kubenswrapper[4774]: E1001 13:38:43.870382 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:43 crc kubenswrapper[4774]: E1001 13:38:43.870027 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.958930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.958993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.959012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.959035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:43 crc kubenswrapper[4774]: I1001 13:38:43.959053 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:43Z","lastTransitionTime":"2025-10-01T13:38:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.062621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.062682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.062699 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.062722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.062739 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.166302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.166390 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.166410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.166435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.166481 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.269670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.269732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.269749 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.269775 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.269793 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.373216 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.373288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.373311 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.373340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.373365 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.476005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.476510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.476710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.476873 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.477005 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.580716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.580755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.580766 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.580783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.580793 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.682618 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.682682 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.682704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.682733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.682754 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.785671 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.786050 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.786251 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.786426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.786659 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.870360 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:44 crc kubenswrapper[4774]: E1001 13:38:44.870584 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.888715 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.888783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.888808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.888838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.888859 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.992024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.992094 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.992112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.992136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:44 crc kubenswrapper[4774]: I1001 13:38:44.992155 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:44Z","lastTransitionTime":"2025-10-01T13:38:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.095113 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.095164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.095176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.095195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.095212 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.197331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.197420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.197447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.197518 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.197537 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.300310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.300355 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.300366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.300383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.300397 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.403331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.403388 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.403405 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.403428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.403446 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.505616 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.505660 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.505672 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.505689 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.505702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.608828 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.608877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.608897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.608920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.608937 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.711938 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.712009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.712053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.712078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.712096 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.815203 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.815262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.815279 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.815302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.815319 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.870028 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.870080 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.870061 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:45 crc kubenswrapper[4774]: E1001 13:38:45.870210 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:45 crc kubenswrapper[4774]: E1001 13:38:45.870344 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:45 crc kubenswrapper[4774]: E1001 13:38:45.870509 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.917963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.918039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.918059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.918087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:45 crc kubenswrapper[4774]: I1001 13:38:45.918108 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:45Z","lastTransitionTime":"2025-10-01T13:38:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.022055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.022122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.022143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.022174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.022196 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.125642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.125722 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.125744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.125774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.125795 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.228088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.228167 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.228191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.228221 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.228243 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.330148 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.330191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.330205 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.330227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.330242 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.433160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.433220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.433238 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.433266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.433282 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.536797 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.536859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.536880 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.536904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.536922 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.640590 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.640669 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.640695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.640726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.640750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.742952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.743001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.743012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.743029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.743040 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.845547 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.845601 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.845617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.845639 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.845654 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.870438 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:46 crc kubenswrapper[4774]: E1001 13:38:46.870596 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.949071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.949121 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.949137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.949158 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:46 crc kubenswrapper[4774]: I1001 13:38:46.949173 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:46Z","lastTransitionTime":"2025-10-01T13:38:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.051805 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.051886 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.051912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.051941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.051963 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.155012 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.155083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.155103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.155129 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.155160 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.257636 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.257701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.257723 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.257753 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.257776 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.361206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.361259 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.361270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.361289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.361300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.463262 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.463326 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.463343 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.463368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.463384 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.567122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.567197 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.567223 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.567257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.567278 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.669977 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.670055 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.670073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.670099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.670117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.773072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.773193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.773212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.773237 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.773257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.870188 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.870187 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.870419 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:47 crc kubenswrapper[4774]: E1001 13:38:47.870354 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:47 crc kubenswrapper[4774]: E1001 13:38:47.870607 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:47 crc kubenswrapper[4774]: E1001 13:38:47.870686 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.876899 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.876953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.876969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.876996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.877015 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.980841 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.980912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.980930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.980956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:47 crc kubenswrapper[4774]: I1001 13:38:47.980973 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:47Z","lastTransitionTime":"2025-10-01T13:38:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.083674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.083738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.083761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.083792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.083816 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.186870 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.186941 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.186959 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.186985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.187006 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.290235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.290293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.290310 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.290334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.290350 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.398874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.398937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.398955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.398980 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.398999 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.501630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.501681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.501693 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.501711 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.501725 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.604854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.604931 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.604954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.604981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.605002 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.707290 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.707340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.707350 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.707363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.707373 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.810684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.810728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.810739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.810779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.810791 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.859888 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.859936 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.859955 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.859976 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.859991 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.870330 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:48 crc kubenswrapper[4774]: E1001 13:38:48.870543 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:48 crc kubenswrapper[4774]: E1001 13:38:48.880767 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.886543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.886617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.886642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.886861 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.886890 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.887657 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.903946 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc 
kubenswrapper[4774]: E1001 13:38:48.907578 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.911759 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.911835 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.911850 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.911867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.911879 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.922155 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce
5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: E1001 13:38:48.930732 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.934628 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.934663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.934674 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.934690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.934702 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.939090 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.949047 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: E1001 13:38:48.951209 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.958335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.958367 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.958377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.958393 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.958406 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.962219 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: E1001 13:38:48.971393 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: E1001 13:38:48.971579 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.973642 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.973692 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.973709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.973733 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.973750 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:48Z","lastTransitionTime":"2025-10-01T13:38:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.978757 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:48 crc kubenswrapper[4774]: I1001 13:38:48.996551 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:48Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.020782 4774 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.036417 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.055500 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.076166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.076213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.076227 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.076245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.076257 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.081900 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:36Z\\\",\\\"message\\\":\\\"eflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977370 6819 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.977526 6819 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977660 6819 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977851 6819 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.978108 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:36.978138 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:36.978144 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:36.978168 6819 factory.go:656] Stopping watch factory\\\\nI1001 13:38:36.978180 6819 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:38:36.978210 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:36.978215 6819 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.093266 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.112408 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.128821 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.149402 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14
b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.165454 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c465e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.177778 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.178928 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.178954 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.178963 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.178978 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.178988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.201653 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caab59aa-3a68-4254-b70c-8dd65e70bb06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09c6caf5ea0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:49Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.281709 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.281748 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.281760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.281776 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.281788 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.384785 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.385200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.385398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.385593 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.385739 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.488974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.489041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.489065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.489096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.489117 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.591492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.591545 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.591564 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.591587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.591602 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.694143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.694231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.694248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.694302 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.694318 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.796914 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.796975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.796988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.797005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.797016 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.833938 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:49 crc kubenswrapper[4774]: E1001 13:38:49.834234 4774 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:38:49 crc kubenswrapper[4774]: E1001 13:38:49.834354 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs podName:67555194-dc73-4f0a-bd6e-1ae0a010067a nodeName:}" failed. No retries permitted until 2025-10-01 13:39:53.834326936 +0000 UTC m=+165.723957563 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs") pod "network-metrics-daemon-hgfsz" (UID: "67555194-dc73-4f0a-bd6e-1ae0a010067a") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.870147 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.870190 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.870159 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:49 crc kubenswrapper[4774]: E1001 13:38:49.870368 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:49 crc kubenswrapper[4774]: E1001 13:38:49.870532 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:49 crc kubenswrapper[4774]: E1001 13:38:49.870684 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.871590 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:38:49 crc kubenswrapper[4774]: E1001 13:38:49.871771 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.899416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.899475 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.899489 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.899509 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:49 crc kubenswrapper[4774]: I1001 13:38:49.899523 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:49Z","lastTransitionTime":"2025-10-01T13:38:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.002802 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.003035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.003099 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.003160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.003231 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.106543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.106625 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.106665 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.106704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.106727 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.210201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.210879 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.211091 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.211394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.211639 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.315057 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.315105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.315121 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.315144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.315163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.418022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.418157 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.418176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.418198 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.418217 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.521425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.521536 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.521556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.521580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.521596 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.624621 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.624716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.624743 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.624774 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.624795 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.727278 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.727363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.727387 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.727410 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.727427 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.830425 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.830522 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.830541 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.830586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.830610 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.869719 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:50 crc kubenswrapper[4774]: E1001 13:38:50.869920 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.933303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.933340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.933352 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.933368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:50 crc kubenswrapper[4774]: I1001 13:38:50.933383 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:50Z","lastTransitionTime":"2025-10-01T13:38:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.037421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.037540 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.037561 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.037591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.037611 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.143878 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.143930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.143943 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.143961 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.143976 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.250744 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.250809 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.250830 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.250857 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.250881 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.353556 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.353631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.353646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.353666 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.353681 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.456085 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.456135 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.456146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.456164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.456176 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.558786 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.558837 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.558849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.558867 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.558881 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.661033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.661067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.661075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.661086 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.661096 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.764293 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.764370 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.764394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.764423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.764440 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.867434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.867544 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.867560 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.867582 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.867600 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.869777 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.869838 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:51 crc kubenswrapper[4774]: E1001 13:38:51.869925 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:51 crc kubenswrapper[4774]: E1001 13:38:51.870108 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.869775 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:51 crc kubenswrapper[4774]: E1001 13:38:51.870275 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.971400 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.971543 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.971568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.971599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:51 crc kubenswrapper[4774]: I1001 13:38:51.971622 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:51Z","lastTransitionTime":"2025-10-01T13:38:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.074727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.074779 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.074796 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.074819 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.074838 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.178370 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.178503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.178531 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.178563 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.178585 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.280958 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.280998 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.281009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.281063 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.281091 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.383680 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.383729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.383745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.383768 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.383784 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.487037 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.487115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.487133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.487161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.487179 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.590755 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.590827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.590851 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.590885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.590906 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.694257 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.694322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.694339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.694364 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.694382 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.796849 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.796906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.796924 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.796946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.796962 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.872689 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:52 crc kubenswrapper[4774]: E1001 13:38:52.872922 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.899195 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.899258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.899277 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.899300 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:52 crc kubenswrapper[4774]: I1001 13:38:52.899318 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:52Z","lastTransitionTime":"2025-10-01T13:38:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.002292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.002368 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.002392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.002421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.002445 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.105702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.105773 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.105791 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.105818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.105840 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.208360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.208419 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.208435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.208484 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.208502 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.311528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.311591 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.311608 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.311632 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.311650 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.414705 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.414761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.414778 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.414803 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.414820 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.517681 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.517742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.517760 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.517782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.517800 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.620595 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.620656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.620673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.620698 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.620715 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.723895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.723937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.723946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.723962 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.723971 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.826831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.826906 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.826923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.826946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.826963 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.869941 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.870031 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.870057 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:53 crc kubenswrapper[4774]: E1001 13:38:53.870236 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:53 crc kubenswrapper[4774]: E1001 13:38:53.870381 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:53 crc kubenswrapper[4774]: E1001 13:38:53.870470 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.929447 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.929520 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.929534 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.929553 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:53 crc kubenswrapper[4774]: I1001 13:38:53.929568 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:53Z","lastTransitionTime":"2025-10-01T13:38:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.032387 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.032448 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.032499 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.032526 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.032544 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.135948 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.136005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.136019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.136038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.136051 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.239423 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.239510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.239529 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.239577 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.239590 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.342133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.342176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.342188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.342204 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.342215 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.446033 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.446098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.446115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.446137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.446157 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.549245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.549327 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.549347 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.549373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.549391 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.652515 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.652572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.652588 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.652617 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.652640 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.756757 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.756818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.756836 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.756860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.756877 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.860004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.860093 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.860119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.860154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.860177 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.869690 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:54 crc kubenswrapper[4774]: E1001 13:38:54.869864 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.963260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.963308 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.963324 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.963348 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:54 crc kubenswrapper[4774]: I1001 13:38:54.963365 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:54Z","lastTransitionTime":"2025-10-01T13:38:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.065902 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.065987 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.066010 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.066048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.066067 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.168235 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.168291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.168309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.168331 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.168348 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.271685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.271762 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.271780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.271807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.271826 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.374631 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.374675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.374690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.374704 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.374714 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.476889 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.476932 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.476944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.476973 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.476988 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.579934 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.579988 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.580004 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.580027 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.580043 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.683206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.683269 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.683291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.683320 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.683341 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.785956 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.786016 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.786032 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.786054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.786074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.870298 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.870368 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:55 crc kubenswrapper[4774]: E1001 13:38:55.870510 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.870551 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:55 crc kubenswrapper[4774]: E1001 13:38:55.870760 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:55 crc kubenswrapper[4774]: E1001 13:38:55.870936 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.888871 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.889159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.889206 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.889228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.889245 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.992005 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.992080 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.992103 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.992134 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:55 crc kubenswrapper[4774]: I1001 13:38:55.992159 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:55Z","lastTransitionTime":"2025-10-01T13:38:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.095576 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.095646 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.095664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.095684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.095697 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.198111 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.198152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.198164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.198179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.198190 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.300957 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.301009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.301022 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.301038 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.301050 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.403910 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.404006 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.404017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.404029 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.404037 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.507337 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.507426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.507446 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.507511 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.507530 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.610274 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.610335 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.610345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.610360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.610371 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.713140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.713248 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.713272 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.713303 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.713324 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.816144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.816187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.816196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.816212 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.816221 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.870029 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:56 crc kubenswrapper[4774]: E1001 13:38:56.870512 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.919443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.919495 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.919505 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.919519 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:56 crc kubenswrapper[4774]: I1001 13:38:56.919528 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:56Z","lastTransitionTime":"2025-10-01T13:38:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.022261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.022334 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.022354 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.022377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.022444 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.124492 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.124525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.124533 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.124565 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.124574 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.227735 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.227806 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.227818 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.227838 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.227850 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.329716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.329782 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.329799 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.329827 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.329844 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.432122 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.432177 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.432199 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.432229 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.432249 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.534989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.535049 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.535062 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.535097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.535111 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.637598 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.637687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.637708 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.637732 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.637794 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.740346 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.740415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.740430 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.740471 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.740482 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.844236 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.844322 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.844345 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.844375 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.844394 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.869770 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.869822 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.870030 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:57 crc kubenswrapper[4774]: E1001 13:38:57.870158 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:57 crc kubenswrapper[4774]: E1001 13:38:57.870307 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:57 crc kubenswrapper[4774]: E1001 13:38:57.870530 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.947684 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.947745 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.947763 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.947788 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:57 crc kubenswrapper[4774]: I1001 13:38:57.947805 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:57Z","lastTransitionTime":"2025-10-01T13:38:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.051380 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.051441 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.051469 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.051485 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.051495 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.154140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.154218 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.154231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.154250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.154263 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.256986 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.257052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.257067 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.257083 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.257094 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.360180 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.360241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.360260 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.360284 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.360301 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.463143 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.463193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.463210 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.463228 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.463240 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.565734 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.565813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.565843 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.565874 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.565895 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.671742 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.671812 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.671856 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.671890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.671915 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.774700 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.774770 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.774783 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.774807 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.774823 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.870344 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:38:58 crc kubenswrapper[4774]: E1001 13:38:58.870676 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.876985 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.877054 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.877112 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.877144 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.877167 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.902034 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:35Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:36Z\\\",\\\"message\\\":\\\"eflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977370 6819 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.977526 6819 reflector.go:311] Stopping 
reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977660 6819 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:38:36.977851 6819 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:38:36.978108 6819 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:38:36.978138 6819 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:38:36.978144 6819 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:38:36.978168 6819 factory.go:656] Stopping watch factory\\\\nI1001 13:38:36.978180 6819 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:38:36.978210 6819 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:38:36.978215 6819 metrics.go:553] Stopping metrics server at address \\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:38:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://425ff1f558123081ac
39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8t7th\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-v7jfr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.919056 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"41bbb9e7-529f-459e-8443-e8f75a6f1085\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ed7f80d2080ee012057d438e1fb37b7aaec85326c7244cad36c8db1d056eaaa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fbfccc0fb02516f07169fd24a6c5e99362e914582513dfb4a624ca493ca2486\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.937916 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99ca8662-f7aa-49f8-be72-e8daa8d4f00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ec610603f9ee38434ea52c9d250c5e5974c1e9b615a1af791a264af58a9816d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fce865418dcb8d8a960dea0c072aa841fee3aab44e2178e8f3c82499dee900fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://57269b54104e043d556655fb079e4b8e60669fc29064c7d18490aa2bf626991a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2be8d61e0b6f103a87b4a0dac1ef3f3ad8c3cff9bbe1ba41545a060c210f11d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:37:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.955051 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.972564 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65006f902bc23d438b77b445a4c17832597e711ccf54ee0bfc499b2830e9fcaa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://edc9ef5ad225410d63182a701f6cf150fef22bbcea8a8b9f2aacefa7cc2681c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.980193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.980231 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.980244 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.980261 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.980274 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:58Z","lastTransitionTime":"2025-10-01T13:38:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:58 crc kubenswrapper[4774]: I1001 13:38:58.988637 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8svls" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2896
789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:38:22Z\\\",\\\"message\\\":\\\"2025-10-01T13:37:36+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7\\\\n2025-10-01T13:37:36+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_de99e4c4-2298-4a23-be56-6892275f18e7 to /host/opt/cni/bin/\\\\n2025-10-01T13:37:37Z [verbose] multus-daemon started\\\\n2025-10-01T13:37:37Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:38:22Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:38:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kmmmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8svls\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:58Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.003110 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-96g6w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"843edfe0-a47c-4ef9-9ec3-938d1605d348\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a484a2f27edc6414e8285b684b9abc6ea87e164cc11cc407244557b858b1393\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m555q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-96g6w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.033931 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"caab59aa-3a68-4254-b70c-8dd65e70bb06\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a8dbee4a5d8ce7fffe47871052bd661bb4d54b300755e97f918d01c08d0b2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a66683806197168c8555e9d34b6bef040276d997c74d4bba58b09c6caf5ea0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5fd3daaad61556f4ebb6d77538083c949c189ce549f2517ad821770b1bb189a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13
:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://729ff2db66d7587b144d707fbf8563b3cb68fc9c223a5759b8161ecd978b45c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b475b057cf9c471a6adf27877773ca7ac4294f478df63a0d687b15eca175415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59672e9bd04df9c232196afcffe5342c8cf03e1eadfc9fb4b02141bca86b00e5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bf08b7aa04540dc206da761969d822384c16b21345d5ca99868e0e2f7fd8828\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ae2659130de3fe463f7405787e53ec741c2eda0369c2fcdcca3f08a2178a082\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.052320 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://904ecc372b8464bb0b2d413c8b888a71a6fe2ccae8afdd28934875de0e3a236b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.070897 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.082995 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.083075 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.083089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.083133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.083148 4774 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.094660 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6c2cbe4-dd67-4f5c-8f47-3d8986219793\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db4d6ce190737c7a8571acca6df22ad5f0294fe67a26a7da30762740f4740c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://138c3577cb255d079cd8f6335a4f5d0e2a134ded1e686d2cf0fef1fb1265aece\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc0c2ef45b4d8b53c737138a9b8826e8e197d66e1c86414ebb4bac740f5597cf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c1c3ef718a6de38a3fa647705cbf965a9c286c4dfbbd1c577db68aeb5b58c53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:37Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5e14b3e7ba0588b6d623335d1d845321610b0145473fd9fffcfb2f782e14eb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ec6722b78dd85777c54e0857afee33861f8250e9c2720a66f0c2464f11fbabe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://46ff1d3fef5500f68e0bf44039af2e927ecefd191cef2f729c7f93f3f45f8a6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhsrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:32Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-h5t2l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.109040 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55a90eeb-9a46-4083-9c5e-4313773da697\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b912a2aa01aa8c4e85869b83193edd664763e0b3924091a0008f8367a307ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49db8d593605085ca8894e505155104dfa4c4
65e0b8c26e1907bb4ff2123ab07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9k75h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:44Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hvkld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.124802 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"354b6ae5-f763-4498-b217-d968f8054589\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2643f3c6ae0129a1e6f88dc04950afbfea0a0e8dd79fe8c939be176f09692b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c218ea6284101ebd2c49f1d74c85ce5d0dcb12ce52d1929e8a714f84afe8ad2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4be8bce9498211dca7cdc4533d1d2ef9d9d18249fb561f36572ae142cff1bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a1db66f296cb803bfb3363e6828175f1d85721a7f68b7a5006f5fe879d62c82f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.138984 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd8df17fcebee3870392a88394464f9e651015ff3911992ff5da3bc9da1c5cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.153377 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67555194-dc73-4f0a-bd6e-1ae0a010067a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l88m2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-hgfsz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc 
kubenswrapper[4774]: I1001 13:38:59.173416 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d500d858-8875-4743-8f10-38f91b5f7e28\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1affc7224e4c6bc9820a5a7617ca6a093c0afdf848583c9465c1cec1dc90891\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cc99fd872c8e3
301d19ef451e588294c1343ccb90dbc0855952857586cdf1ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ecc1c28040852e9d11efe729ff2e9ff582af936c611e5ee3e5b585dd10373c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0796b25c9ea32f13168d63ce6f2da2b40fe2b03cdc2320f2564a72410fc06d39\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://2f1ea67ca86e8a7f3ae4f9636a74a72ec2cb8cbff3cd273887b246b50fda87e9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:37:13Z\\\",\\\"message\\\":\\\"W1001 13:37:13.164090 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:37:13.164476 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759325833 cert, and key in /tmp/serving-cert-1596716916/serving-signer.crt, /tmp/serving-cert-1596716916/serving-signer.key\\\\nI1001 13:37:13.456508 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:37:13.460477 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:37:13.460638 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:37:13.461422 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1596716916/tls.crt::/tmp/serving-cert-1596716916/tls.key\\\\\\\"\\\\nF1001 13:37:13.702898 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:37:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2da38a1d1ede717dd9a0803282f5b38b0b46953b6a6d44d66d6c6f05dccef03e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:12Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af15878781f453cd49d340bc38465e6f4ea980d5703a10dbaf291a3713df0df5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:37:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-10-01T13:37:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.185904 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.185940 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.185952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.185969 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.185982 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.190825 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.204138 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-679cg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8660d853-6a54-48f4-a6fd-275176a4bf1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3641f8b7ae6047a60aece6219a374af23051a78a01c354341f8eac56a208ea54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zcfkx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-679cg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.216316 4774 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18618ab0-7244-42b3-9ccd-60661c89c742\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:37:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f9e53759e81a9b360b89d1077b9d3de49c8c1f0c459476295ef2e35040510d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:37:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjmqb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:37:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-74ttd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.289013 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.289107 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.289124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.289146 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.289162 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.330267 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.330377 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.330394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.330416 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.330433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.350831 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.355643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.355687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.355702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.355726 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.355744 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.374858 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.379942 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.379997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.380021 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.380048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.380069 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.399698 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.404015 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.404097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.404116 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.404174 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.404198 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.429386 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.435609 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.435673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.435695 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.435724 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.435745 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.458123 4774 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:38:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"43812622-110d-4c9c-94ff-65b8a298322f\\\",\\\"systemUUID\\\":\\\"75fe681e-c594-4ab2-ad84-cd261c47a27a\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:38:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.458381 4774 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.460765 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.460834 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.460854 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.460881 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.460900 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.564048 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.564118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.564133 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.564151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.564185 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.667120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.667196 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.667213 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.667239 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.667256 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.770488 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.770579 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.770599 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.770624 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.770645 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.869637 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.869787 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.870024 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.870131 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.870332 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:38:59 crc kubenswrapper[4774]: E1001 13:38:59.870428 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.874188 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.874241 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.874258 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.874280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.874297 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.977891 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.978011 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.978071 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.978097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:38:59 crc kubenswrapper[4774]: I1001 13:38:59.978171 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:38:59Z","lastTransitionTime":"2025-10-01T13:38:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.081859 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.081897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.081908 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.081923 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.081935 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.184824 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.185176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.185370 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.185570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.185745 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.289035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.289096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.289114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.289140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.289160 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.392066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.392137 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.392154 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.392178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.392197 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.495975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.496042 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.496060 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.496084 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.496102 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.599528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.599587 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.599607 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.599630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.599649 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.701983 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.702024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.702039 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.702059 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.702074 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.805309 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.805383 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.805408 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.805436 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.805485 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.870198 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:00 crc kubenswrapper[4774]: E1001 13:39:00.870649 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.908575 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.908649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.908670 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.908701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:00 crc kubenswrapper[4774]: I1001 13:39:00.908725 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:00Z","lastTransitionTime":"2025-10-01T13:39:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.011829 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.011895 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.011912 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.011937 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.011955 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.115187 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.115250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.115270 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.115294 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.115313 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.217381 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.217421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.217434 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.217480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.217497 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.319868 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.319989 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.320001 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.320020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.320033 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.423584 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.423652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.423685 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.423701 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.423712 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.525903 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.525944 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.525953 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.525974 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.525983 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.628567 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.628615 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.628633 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.628656 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.628675 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.730907 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.730979 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.730997 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.731020 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.731038 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.833052 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.833118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.833140 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.833161 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.833176 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.869951 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.870004 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.870013 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:01 crc kubenswrapper[4774]: E1001 13:39:01.870107 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:01 crc kubenswrapper[4774]: E1001 13:39:01.870390 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:01 crc kubenswrapper[4774]: E1001 13:39:01.870697 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.935421 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.935510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.935528 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.935551 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:01 crc kubenswrapper[4774]: I1001 13:39:01.935569 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:01Z","lastTransitionTime":"2025-10-01T13:39:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.037716 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.037772 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.037792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.037820 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.037842 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.140291 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.140340 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.140362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.140389 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.140412 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.242629 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.242706 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.242728 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.242751 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.242769 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.345415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.345594 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.345615 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.345638 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.345655 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.448663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.448713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.448730 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.448752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.448768 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.552185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.552250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.552266 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.552288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.552304 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.655402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.655493 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.655521 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.655550 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.655573 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.758255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.758336 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.758362 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.758396 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.758420 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.861043 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.861079 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.861089 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.861104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.861115 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.870660 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:02 crc kubenswrapper[4774]: E1001 13:39:02.870911 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.964061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.964115 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.964132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.964153 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:02 crc kubenswrapper[4774]: I1001 13:39:02.964170 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:02Z","lastTransitionTime":"2025-10-01T13:39:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.067872 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.067975 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.067993 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.068019 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.068039 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.171145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.171208 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.171225 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.171252 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.171269 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.274649 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.274702 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.274714 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.274729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.274740 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.377847 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.377875 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.377884 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.377897 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.377906 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.481652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.481713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.481729 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.481754 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.481772 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.584801 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.584896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.584920 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.584951 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.585001 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.688752 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.688787 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.688798 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.688813 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.688823 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.792159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.792233 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.792255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.792281 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.792300 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.869809 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.869880 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.869900 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:03 crc kubenswrapper[4774]: E1001 13:39:03.870012 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:03 crc kubenswrapper[4774]: E1001 13:39:03.870183 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:03 crc kubenswrapper[4774]: E1001 13:39:03.870276 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.870985 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:39:03 crc kubenswrapper[4774]: E1001 13:39:03.871152 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.895053 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.895097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.895105 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.895119 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.895130 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.997339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.997398 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.997415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.997443 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:03 crc kubenswrapper[4774]: I1001 13:39:03.997504 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:03Z","lastTransitionTime":"2025-10-01T13:39:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.100761 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.100877 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.100896 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.100918 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.100934 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.203946 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.204023 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.204041 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.204065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.204085 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.306179 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.306220 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.306230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.306246 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.306258 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.409635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.410066 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.410242 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.410403 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.410630 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.513687 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.514437 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.514622 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.514860 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.515046 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.617739 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.617792 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.617808 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.617831 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.617848 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.720574 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.720643 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.720663 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.720690 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.720709 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.823024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.823088 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.823104 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.823156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.823176 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.870330 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:04 crc kubenswrapper[4774]: E1001 13:39:04.870593 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.926424 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.926538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.926562 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.926586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:04 crc kubenswrapper[4774]: I1001 13:39:04.926606 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:04Z","lastTransitionTime":"2025-10-01T13:39:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.030065 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.030166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.030200 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.030245 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.030272 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.133718 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.133795 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.133816 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.133846 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.133868 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.237344 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.237420 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.237440 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.237503 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.237526 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.341570 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.341652 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.341673 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.341710 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.341731 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.445166 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.445263 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.445280 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.445312 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.445332 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.548306 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.548373 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.548394 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.548415 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.548433 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.652047 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.652132 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.652151 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.652175 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.652192 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.755193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.755289 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.755339 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.755366 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.755416 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.858073 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.858141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.858159 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.858185 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.858209 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.869532 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.869646 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:05 crc kubenswrapper[4774]: E1001 13:39:05.869709 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.869532 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:05 crc kubenswrapper[4774]: E1001 13:39:05.869853 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:05 crc kubenswrapper[4774]: E1001 13:39:05.869978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.961024 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.961097 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.961114 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.961142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:05 crc kubenswrapper[4774]: I1001 13:39:05.961201 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:05Z","lastTransitionTime":"2025-10-01T13:39:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.064996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.065058 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.065078 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.065118 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.065137 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.168538 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.168612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.168635 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.168664 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.168686 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.272490 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.272568 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.272586 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.272611 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.272638 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.377082 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.377156 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.377176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.377207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.377228 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.480583 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.480634 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.480653 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.480675 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.480694 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.583784 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.583866 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.583885 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.583916 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.583934 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.687428 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.687552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.687580 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.687612 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.687629 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.790077 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.790124 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.790136 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.790152 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.790163 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.870442 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:06 crc kubenswrapper[4774]: E1001 13:39:06.870703 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.892145 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.892230 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.892255 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.892288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.892319 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.995170 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.995250 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.995264 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.995292 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:06 crc kubenswrapper[4774]: I1001 13:39:06.995314 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:06Z","lastTransitionTime":"2025-10-01T13:39:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.098630 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.098727 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.098750 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.098780 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.098802 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.201392 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.201465 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.201480 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.201498 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.201511 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.304439 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.304539 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.304552 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.304572 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.304583 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.407626 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.407694 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.407713 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.407738 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.407759 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.511178 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.511287 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.511359 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.511395 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.511666 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.614930 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.614990 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.615009 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.615035 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.615052 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.718429 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.718497 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.718507 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.718525 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.718536 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.822141 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.822191 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.822207 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.822232 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.822249 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.870269 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.870392 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.870269 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:07 crc kubenswrapper[4774]: E1001 13:39:07.870447 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:07 crc kubenswrapper[4774]: E1001 13:39:07.870836 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:07 crc kubenswrapper[4774]: E1001 13:39:07.870983 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.925330 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.925409 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.925435 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.925506 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:07 crc kubenswrapper[4774]: I1001 13:39:07.925528 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:07Z","lastTransitionTime":"2025-10-01T13:39:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.029100 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.029160 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.029176 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.029201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.029218 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.132017 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.132081 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.132098 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.132120 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.132137 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.235296 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.235360 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.235378 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.235402 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.235423 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.338363 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.338426 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.338445 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.338510 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.338526 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.441960 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.442025 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.442061 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.442092 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.442121 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.544981 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.545072 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.545096 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.545123 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.545144 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.647890 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.647952 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.647970 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.647996 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.648014 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.751201 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.751265 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.751288 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.751318 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.751343 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:08Z","lastTransitionTime":"2025-10-01T13:39:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:08 crc kubenswrapper[4774]: E1001 13:39:08.851666 4774 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.869857 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:08 crc kubenswrapper[4774]: E1001 13:39:08.870028 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.990977 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-h5t2l" podStartSLOduration=97.990954523 podStartE2EDuration="1m37.990954523s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:08.970486685 +0000 UTC m=+120.860117292" watchObservedRunningTime="2025-10-01 13:39:08.990954523 +0000 UTC m=+120.880585130" Oct 01 13:39:08 crc kubenswrapper[4774]: I1001 13:39:08.991646 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hvkld" podStartSLOduration=97.991640141 podStartE2EDuration="1m37.991640141s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:08.991515628 +0000 UTC m=+120.881146265" watchObservedRunningTime="2025-10-01 13:39:08.991640141 +0000 UTC m=+120.881270748" Oct 01 13:39:08 crc kubenswrapper[4774]: E1001 13:39:08.995317 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.009572 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-96g6w" podStartSLOduration=98.009546963 podStartE2EDuration="1m38.009546963s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.009510952 +0000 UTC m=+120.899141589" watchObservedRunningTime="2025-10-01 13:39:09.009546963 +0000 UTC m=+120.899177570" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.048500 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=35.048434615 podStartE2EDuration="35.048434615s" podCreationTimestamp="2025-10-01 13:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.048277571 +0000 UTC m=+120.937908188" watchObservedRunningTime="2025-10-01 13:39:09.048434615 +0000 UTC m=+120.938065252" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.106780 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.106758509 podStartE2EDuration="1m3.106758509s" podCreationTimestamp="2025-10-01 13:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.106694857 +0000 UTC m=+120.996325494" watchObservedRunningTime="2025-10-01 13:39:09.106758509 +0000 UTC m=+120.996389146" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.163931 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podStartSLOduration=99.163903392 podStartE2EDuration="1m39.163903392s" 
podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.163446831 +0000 UTC m=+121.053077518" watchObservedRunningTime="2025-10-01 13:39:09.163903392 +0000 UTC m=+121.053534039" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.164239 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-679cg" podStartSLOduration=99.164231221 podStartE2EDuration="1m39.164231221s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.146510504 +0000 UTC m=+121.036141151" watchObservedRunningTime="2025-10-01 13:39:09.164231221 +0000 UTC m=+121.053861858" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.220700 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=99.220670706 podStartE2EDuration="1m39.220670706s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.194282226 +0000 UTC m=+121.083912863" watchObservedRunningTime="2025-10-01 13:39:09.220670706 +0000 UTC m=+121.110301333" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.221301 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=99.221284152 podStartE2EDuration="1m39.221284152s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.219091575 +0000 UTC m=+121.108722242" 
watchObservedRunningTime="2025-10-01 13:39:09.221284152 +0000 UTC m=+121.110914829" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.292012 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8svls" podStartSLOduration=98.291980395 podStartE2EDuration="1m38.291980395s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.291608335 +0000 UTC m=+121.181238972" watchObservedRunningTime="2025-10-01 13:39:09.291980395 +0000 UTC m=+121.181611032" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.347318 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=53.347289771 podStartE2EDuration="53.347289771s" podCreationTimestamp="2025-10-01 13:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:09.346779608 +0000 UTC m=+121.236410245" watchObservedRunningTime="2025-10-01 13:39:09.347289771 +0000 UTC m=+121.236920398" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.544883 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/1.log" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.545695 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/0.log" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.545789 4774 generic.go:334] "Generic (PLEG): container finished" podID="be8a0f8f-0098-4fa6-b4b2-ceda580f19b5" containerID="8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e" exitCode=1 Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.545848 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerDied","Data":"8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e"} Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.545924 4774 scope.go:117] "RemoveContainer" containerID="2896789f4c445ebe9f8658ff9d432f77256a5d6d63241fc8dcea3b6599b70155" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.546483 4774 scope.go:117] "RemoveContainer" containerID="8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e" Oct 01 13:39:09 crc kubenswrapper[4774]: E1001 13:39:09.546714 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8svls_openshift-multus(be8a0f8f-0098-4fa6-b4b2-ceda580f19b5)\"" pod="openshift-multus/multus-8svls" podUID="be8a0f8f-0098-4fa6-b4b2-ceda580f19b5" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.656087 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.656142 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.656164 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.656193 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.656215 4774 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:39:09Z","lastTransitionTime":"2025-10-01T13:39:09Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.718827 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8"] Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.719365 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.725367 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.725584 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.725495 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.725549 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.868346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86e6b59b-c02d-4401-ad39-8b83c49348b4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.868405 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/86e6b59b-c02d-4401-ad39-8b83c49348b4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.868442 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86e6b59b-c02d-4401-ad39-8b83c49348b4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.868580 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86e6b59b-c02d-4401-ad39-8b83c49348b4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.868678 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86e6b59b-c02d-4401-ad39-8b83c49348b4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.870134 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.870230 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:09 crc kubenswrapper[4774]: E1001 13:39:09.870313 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.870148 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:09 crc kubenswrapper[4774]: E1001 13:39:09.870563 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:09 crc kubenswrapper[4774]: E1001 13:39:09.870736 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970330 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86e6b59b-c02d-4401-ad39-8b83c49348b4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970425 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86e6b59b-c02d-4401-ad39-8b83c49348b4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970498 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86e6b59b-c02d-4401-ad39-8b83c49348b4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970564 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86e6b59b-c02d-4401-ad39-8b83c49348b4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970621 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/86e6b59b-c02d-4401-ad39-8b83c49348b4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970732 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/86e6b59b-c02d-4401-ad39-8b83c49348b4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.970754 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/86e6b59b-c02d-4401-ad39-8b83c49348b4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.972020 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/86e6b59b-c02d-4401-ad39-8b83c49348b4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:09 crc kubenswrapper[4774]: I1001 13:39:09.978307 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86e6b59b-c02d-4401-ad39-8b83c49348b4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:10 crc 
kubenswrapper[4774]: I1001 13:39:09.999923 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/86e6b59b-c02d-4401-ad39-8b83c49348b4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bxgm8\" (UID: \"86e6b59b-c02d-4401-ad39-8b83c49348b4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:10 crc kubenswrapper[4774]: I1001 13:39:10.041308 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" Oct 01 13:39:10 crc kubenswrapper[4774]: I1001 13:39:10.555918 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" event={"ID":"86e6b59b-c02d-4401-ad39-8b83c49348b4","Type":"ContainerStarted","Data":"27922aac1422f8fd290f0fff5df820020ef7ef2fdeb4ec427a3799f693fddc8d"} Oct 01 13:39:10 crc kubenswrapper[4774]: I1001 13:39:10.556240 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" event={"ID":"86e6b59b-c02d-4401-ad39-8b83c49348b4","Type":"ContainerStarted","Data":"0da7015767a9f64fa659998be967d0869e8a74367f9f27cc357570dbc7ee0d76"} Oct 01 13:39:10 crc kubenswrapper[4774]: I1001 13:39:10.558903 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/1.log" Oct 01 13:39:10 crc kubenswrapper[4774]: I1001 13:39:10.870137 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:10 crc kubenswrapper[4774]: E1001 13:39:10.870370 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:11 crc kubenswrapper[4774]: I1001 13:39:11.869410 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:11 crc kubenswrapper[4774]: E1001 13:39:11.869571 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:11 crc kubenswrapper[4774]: I1001 13:39:11.870009 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:11 crc kubenswrapper[4774]: I1001 13:39:11.870083 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:11 crc kubenswrapper[4774]: E1001 13:39:11.870529 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:11 crc kubenswrapper[4774]: E1001 13:39:11.870320 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:12 crc kubenswrapper[4774]: I1001 13:39:12.870026 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:12 crc kubenswrapper[4774]: E1001 13:39:12.870191 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:13 crc kubenswrapper[4774]: I1001 13:39:13.869473 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:13 crc kubenswrapper[4774]: I1001 13:39:13.869504 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:13 crc kubenswrapper[4774]: I1001 13:39:13.869489 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:13 crc kubenswrapper[4774]: E1001 13:39:13.869618 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:13 crc kubenswrapper[4774]: E1001 13:39:13.869931 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:13 crc kubenswrapper[4774]: E1001 13:39:13.870013 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:13 crc kubenswrapper[4774]: E1001 13:39:13.996951 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:39:14 crc kubenswrapper[4774]: I1001 13:39:14.870209 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:14 crc kubenswrapper[4774]: E1001 13:39:14.870591 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:14 crc kubenswrapper[4774]: I1001 13:39:14.871662 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:39:14 crc kubenswrapper[4774]: E1001 13:39:14.871946 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-v7jfr_openshift-ovn-kubernetes(e3ee3cb3-6187-468f-9b58-60a18ef2da67)\"" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" Oct 01 13:39:15 crc kubenswrapper[4774]: I1001 13:39:15.869429 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:15 crc kubenswrapper[4774]: I1001 13:39:15.869519 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:15 crc kubenswrapper[4774]: I1001 13:39:15.869493 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:15 crc kubenswrapper[4774]: E1001 13:39:15.869657 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:15 crc kubenswrapper[4774]: E1001 13:39:15.869758 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:15 crc kubenswrapper[4774]: E1001 13:39:15.869956 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:16 crc kubenswrapper[4774]: I1001 13:39:16.870241 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:16 crc kubenswrapper[4774]: E1001 13:39:16.870812 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:17 crc kubenswrapper[4774]: I1001 13:39:17.870408 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:17 crc kubenswrapper[4774]: I1001 13:39:17.870481 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:17 crc kubenswrapper[4774]: I1001 13:39:17.870522 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:17 crc kubenswrapper[4774]: E1001 13:39:17.870635 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:17 crc kubenswrapper[4774]: E1001 13:39:17.870791 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:17 crc kubenswrapper[4774]: E1001 13:39:17.870935 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:18 crc kubenswrapper[4774]: I1001 13:39:18.870039 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:18 crc kubenswrapper[4774]: E1001 13:39:18.871825 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:18 crc kubenswrapper[4774]: E1001 13:39:18.997871 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:39:19 crc kubenswrapper[4774]: I1001 13:39:19.869716 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:19 crc kubenswrapper[4774]: I1001 13:39:19.869753 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:19 crc kubenswrapper[4774]: E1001 13:39:19.870043 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:19 crc kubenswrapper[4774]: E1001 13:39:19.870166 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:19 crc kubenswrapper[4774]: I1001 13:39:19.870557 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:19 crc kubenswrapper[4774]: E1001 13:39:19.870721 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:20 crc kubenswrapper[4774]: I1001 13:39:20.870799 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:20 crc kubenswrapper[4774]: E1001 13:39:20.871003 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:21 crc kubenswrapper[4774]: I1001 13:39:21.869685 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:21 crc kubenswrapper[4774]: I1001 13:39:21.869742 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:21 crc kubenswrapper[4774]: I1001 13:39:21.869780 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:21 crc kubenswrapper[4774]: E1001 13:39:21.869818 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:21 crc kubenswrapper[4774]: E1001 13:39:21.869951 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:21 crc kubenswrapper[4774]: E1001 13:39:21.870142 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:22 crc kubenswrapper[4774]: I1001 13:39:22.872220 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:22 crc kubenswrapper[4774]: E1001 13:39:22.872575 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:22 crc kubenswrapper[4774]: I1001 13:39:22.873499 4774 scope.go:117] "RemoveContainer" containerID="8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e" Oct 01 13:39:22 crc kubenswrapper[4774]: I1001 13:39:22.913891 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bxgm8" podStartSLOduration=111.91386812 podStartE2EDuration="1m51.91386812s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:10.576577367 +0000 UTC m=+122.466208014" watchObservedRunningTime="2025-10-01 13:39:22.91386812 +0000 UTC m=+134.803498747" Oct 01 13:39:23 crc kubenswrapper[4774]: I1001 13:39:23.611218 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/1.log" Oct 01 13:39:23 crc kubenswrapper[4774]: I1001 13:39:23.611312 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerStarted","Data":"e564739acfde2ac5595724369cd4bac33083d339207bb468d7886b72ecf7cb09"} Oct 01 13:39:23 crc kubenswrapper[4774]: I1001 13:39:23.870416 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:23 crc kubenswrapper[4774]: I1001 13:39:23.870531 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:23 crc kubenswrapper[4774]: I1001 13:39:23.870532 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:23 crc kubenswrapper[4774]: E1001 13:39:23.870615 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:23 crc kubenswrapper[4774]: E1001 13:39:23.870694 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:23 crc kubenswrapper[4774]: E1001 13:39:23.870860 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:24 crc kubenswrapper[4774]: E1001 13:39:23.999909 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:39:24 crc kubenswrapper[4774]: I1001 13:39:24.869532 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:24 crc kubenswrapper[4774]: E1001 13:39:24.869710 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:25 crc kubenswrapper[4774]: I1001 13:39:25.869629 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:25 crc kubenswrapper[4774]: I1001 13:39:25.869700 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:25 crc kubenswrapper[4774]: I1001 13:39:25.869630 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:25 crc kubenswrapper[4774]: E1001 13:39:25.869839 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:25 crc kubenswrapper[4774]: E1001 13:39:25.869942 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:25 crc kubenswrapper[4774]: E1001 13:39:25.870295 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:26 crc kubenswrapper[4774]: I1001 13:39:26.870536 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:26 crc kubenswrapper[4774]: E1001 13:39:26.870716 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:27 crc kubenswrapper[4774]: I1001 13:39:27.870310 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:27 crc kubenswrapper[4774]: I1001 13:39:27.870435 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:27 crc kubenswrapper[4774]: E1001 13:39:27.870529 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:27 crc kubenswrapper[4774]: E1001 13:39:27.870667 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:27 crc kubenswrapper[4774]: I1001 13:39:27.871706 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:27 crc kubenswrapper[4774]: I1001 13:39:27.871947 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:39:27 crc kubenswrapper[4774]: E1001 13:39:27.871963 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.635818 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/3.log" Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.641123 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerStarted","Data":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.641774 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.692115 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podStartSLOduration=117.69208923 podStartE2EDuration="1m57.69208923s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:28.690530437 +0000 UTC m=+140.580161144" watchObservedRunningTime="2025-10-01 13:39:28.69208923 +0000 UTC m=+140.581719857" Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.870425 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:28 crc kubenswrapper[4774]: E1001 13:39:28.872089 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.914184 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgfsz"] Oct 01 13:39:28 crc kubenswrapper[4774]: I1001 13:39:28.914371 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:28 crc kubenswrapper[4774]: E1001 13:39:28.914556 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:29 crc kubenswrapper[4774]: E1001 13:39:29.000324 4774 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:39:29 crc kubenswrapper[4774]: I1001 13:39:29.869908 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:29 crc kubenswrapper[4774]: E1001 13:39:29.870072 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:29 crc kubenswrapper[4774]: I1001 13:39:29.870609 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:29 crc kubenswrapper[4774]: E1001 13:39:29.870727 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:30 crc kubenswrapper[4774]: I1001 13:39:30.869677 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:30 crc kubenswrapper[4774]: I1001 13:39:30.869807 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:30 crc kubenswrapper[4774]: E1001 13:39:30.869912 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:30 crc kubenswrapper[4774]: E1001 13:39:30.870097 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:31 crc kubenswrapper[4774]: I1001 13:39:31.870543 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:31 crc kubenswrapper[4774]: I1001 13:39:31.870622 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:31 crc kubenswrapper[4774]: E1001 13:39:31.870741 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:31 crc kubenswrapper[4774]: E1001 13:39:31.870914 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:32 crc kubenswrapper[4774]: I1001 13:39:32.870422 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:32 crc kubenswrapper[4774]: I1001 13:39:32.870509 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:32 crc kubenswrapper[4774]: E1001 13:39:32.870984 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:39:32 crc kubenswrapper[4774]: E1001 13:39:32.871173 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hgfsz" podUID="67555194-dc73-4f0a-bd6e-1ae0a010067a" Oct 01 13:39:33 crc kubenswrapper[4774]: I1001 13:39:33.870315 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:33 crc kubenswrapper[4774]: I1001 13:39:33.870342 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:33 crc kubenswrapper[4774]: E1001 13:39:33.870599 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:39:33 crc kubenswrapper[4774]: E1001 13:39:33.870781 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:39:34 crc kubenswrapper[4774]: I1001 13:39:34.869830 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:34 crc kubenswrapper[4774]: I1001 13:39:34.869844 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:34 crc kubenswrapper[4774]: I1001 13:39:34.874361 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 13:39:34 crc kubenswrapper[4774]: I1001 13:39:34.875606 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 13:39:34 crc kubenswrapper[4774]: I1001 13:39:34.875825 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 13:39:34 crc kubenswrapper[4774]: I1001 13:39:34.875840 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 13:39:35 crc kubenswrapper[4774]: I1001 13:39:35.869896 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:35 crc kubenswrapper[4774]: I1001 13:39:35.869958 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:35 crc kubenswrapper[4774]: I1001 13:39:35.873106 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 13:39:35 crc kubenswrapper[4774]: I1001 13:39:35.874101 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.271592 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.271681 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.890756 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.890971 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.891076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.891138 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.891196 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:37 crc kubenswrapper[4774]: E1001 13:39:37.891711 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:41:39.891663563 +0000 UTC m=+271.781294200 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.892843 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.900098 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.900409 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.900590 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:37 crc kubenswrapper[4774]: I1001 13:39:37.917245 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.000026 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.013367 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:38 crc kubenswrapper[4774]: W1001 13:39:38.223777 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-1225f7b0fa6b61d4bddf0c948d02e6fa8fe72377f6e84d8bde6bc6b6e66c4e86 WatchSource:0}: Error finding container 1225f7b0fa6b61d4bddf0c948d02e6fa8fe72377f6e84d8bde6bc6b6e66c4e86: Status 404 returned error can't find the container with id 1225f7b0fa6b61d4bddf0c948d02e6fa8fe72377f6e84d8bde6bc6b6e66c4e86 Oct 01 13:39:38 crc kubenswrapper[4774]: W1001 13:39:38.274941 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-c11cf4556c76cca8bd4d30ec1834cfe94624b82a4afb558691c37c6f506acd3e WatchSource:0}: Error finding container c11cf4556c76cca8bd4d30ec1834cfe94624b82a4afb558691c37c6f506acd3e: Status 404 returned error can't find the container with id c11cf4556c76cca8bd4d30ec1834cfe94624b82a4afb558691c37c6f506acd3e Oct 01 13:39:38 crc kubenswrapper[4774]: W1001 
13:39:38.460116 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-a78472457cd61fe401dc5013f62976c7ddf20125548df74edfd7d6b56d0cb9e5 WatchSource:0}: Error finding container a78472457cd61fe401dc5013f62976c7ddf20125548df74edfd7d6b56d0cb9e5: Status 404 returned error can't find the container with id a78472457cd61fe401dc5013f62976c7ddf20125548df74edfd7d6b56d0cb9e5 Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.504111 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.689875 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8bc4981a7bed0e265b5a2d05c6bb08baf58d479c739bab12eff9574e691c69ff"} Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.689920 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c11cf4556c76cca8bd4d30ec1834cfe94624b82a4afb558691c37c6f506acd3e"} Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.690203 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.692211 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4f313bf2d00fb52c590d67dbbfa5b927d1b5a2d915ba93309f9918bce37c2466"} Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.692266 4774 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a78472457cd61fe401dc5013f62976c7ddf20125548df74edfd7d6b56d0cb9e5"} Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.694124 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"09a15f786cb2e8f0ad716340121584f59eb2f56baf719dba615545401a233724"} Oct 01 13:39:38 crc kubenswrapper[4774]: I1001 13:39:38.694184 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"1225f7b0fa6b61d4bddf0c948d02e6fa8fe72377f6e84d8bde6bc6b6e66c4e86"} Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.285737 4774 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.335325 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nndfg"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.336338 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.337810 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.338155 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wpmxq"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.338478 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.338511 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r7crv"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.339094 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.339123 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.341857 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.342193 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.342429 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.342655 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.342849 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.343583 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.349483 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.350082 4774 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.352967 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.353182 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.353408 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.369967 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.370098 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.370252 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.370502 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.370602 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.370688 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 13:39:40 crc 
kubenswrapper[4774]: W1001 13:39:40.370951 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.370968 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.370993 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.371002 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.371049 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the 
namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.371066 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.371178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.371315 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.371346 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.371394 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc 
kubenswrapper[4774]: E1001 13:39:40.371411 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.371513 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.371533 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.370674 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.373003 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.373402 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.373582 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.375139 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.375375 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.375478 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.380514 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.380570 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.380538 4774 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.380735 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.380860 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: W1001 13:39:40.380934 4774 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Oct 01 13:39:40 crc kubenswrapper[4774]: E1001 13:39:40.380973 4774 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.381024 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.381027 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.381109 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.381169 4774 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.382331 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jzv9l"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.382863 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.383584 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-chw44"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.384394 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.385106 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.386987 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.387400 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nndfg"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.387508 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.391309 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.392144 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.392245 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9f8ft"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.392918 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393569 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393779 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393911 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393991 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394064 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393998 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393918 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.393953 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 
13:39:40.394105 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394225 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394318 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394323 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394356 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394723 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394793 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.394846 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.397665 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjxqs"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.398150 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.398610 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.399094 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.400757 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.400934 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.401701 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.401748 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.405511 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.405756 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.405902 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.406005 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.406066 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.406125 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-sfqhx"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.406656 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.414226 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xcl4x"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.415036 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.419511 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.420294 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421086 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421118 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qs6\" (UniqueName: \"kubernetes.io/projected/2abdae49-e923-4ba8-92f8-376d7cde1af2-kube-api-access-h6qs6\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421149 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zghhr\" (UniqueName: \"kubernetes.io/projected/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-kube-api-access-zghhr\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421171 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-etcd-serving-ca\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421197 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-audit-dir\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421219 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx67f\" (UniqueName: \"kubernetes.io/projected/b69e1571-8ffe-4425-917c-bb7021c3c74b-kube-api-access-nx67f\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421241 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-serving-cert\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421262 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-client-ca\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 
13:39:40.421296 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-encryption-config\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421317 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-client-ca\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421340 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-config\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421359 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzpg\" (UniqueName: \"kubernetes.io/projected/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-kube-api-access-wgzpg\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421390 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-audit\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421410 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-images\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421432 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421471 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212cd75f-356e-4ed5-a82a-98617024f18c-serving-cert\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421491 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-serving-cert\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421513 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421537 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-client\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421560 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w59xr\" (UniqueName: \"kubernetes.io/projected/212cd75f-356e-4ed5-a82a-98617024f18c-kube-api-access-w59xr\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421583 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-config\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-image-import-ca\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421625 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421647 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-dir\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421667 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-config\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421686 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abdae49-e923-4ba8-92f8-376d7cde1af2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421707 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-config\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: 
\"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421727 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-node-pullsecrets\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421747 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-encryption-config\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421768 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421788 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-etcd-client\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.421808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.425480 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.425913 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.431728 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.432694 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.433062 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.433359 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.446519 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.447588 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8gh62"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.447923 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.448298 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.448433 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.448612 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.452961 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.466418 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.469568 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.470414 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxbs7"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.470638 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.471183 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.471669 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.471956 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.478573 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.478858 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.480686 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.482375 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.484536 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xsqjd"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.485952 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.502039 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.502187 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.502641 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"] Oct 01 13:39:40 crc 
kubenswrapper[4774]: I1001 13:39:40.503137 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.503706 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.503961 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.504104 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.504745 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.504835 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.504941 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.505041 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.505106 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.506673 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-r7bcf"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.506724 4774 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.506942 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.507157 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.507218 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.507377 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.507688 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.507908 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508107 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508157 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508218 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508342 4774 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508488 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508502 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508246 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508620 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508736 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508795 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.508942 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509016 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509132 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509184 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509293 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509414 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509438 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509184 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509547 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509565 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509680 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509716 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509778 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.509808 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 13:39:40 crc 
kubenswrapper[4774]: I1001 13:39:40.510283 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.510671 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j4nkp"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.511294 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.512180 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.512589 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.513500 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.513890 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.515034 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.515595 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.515843 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.516287 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.522973 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7539887f-d1c5-417e-aaf3-669de74c241d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523018 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26f66ad2-cd4c-4352-a060-f115420788ab-webhook-cert\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523051 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-encryption-config\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523077 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw7pq\" (UniqueName: \"kubernetes.io/projected/1c4215dd-b2d9-4617-9bc4-43536f0a06f6-kube-api-access-bw7pq\") pod \"downloads-7954f5f757-9f8ft\" (UID: \"1c4215dd-b2d9-4617-9bc4-43536f0a06f6\") " pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523102 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523164 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da7540b9-c6a0-41db-a094-486631000bdd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523187 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ae865cc-0785-4017-9e04-be7d244b0493-metrics-tls\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523231 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-policies\") pod 
\"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523250 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ae865cc-0785-4017-9e04-be7d244b0493-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523277 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-client-ca\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523324 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-config\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523349 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgzpg\" (UniqueName: \"kubernetes.io/projected/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-kube-api-access-wgzpg\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523389 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-audit\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523411 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-images\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523436 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7tdn\" (UniqueName: \"kubernetes.io/projected/da7540b9-c6a0-41db-a094-486631000bdd-kube-api-access-x7tdn\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523492 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmwps\" (UniqueName: \"kubernetes.io/projected/26f66ad2-cd4c-4352-a060-f115420788ab-kube-api-access-bmwps\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523516 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljfn\" (UniqueName: \"kubernetes.io/projected/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-kube-api-access-hljfn\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523538 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523581 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ff0b1ac-8f04-4329-a5c7-cef871a84890-machine-approver-tls\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj2r6\" (UniqueName: \"kubernetes.io/projected/91e70912-55cd-44d4-be6f-b6c637bec430-kube-api-access-jj2r6\") pod \"control-plane-machine-set-operator-78cbb6b69f-zwfjn\" (UID: \"91e70912-55cd-44d4-be6f-b6c637bec430\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523651 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523678 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212cd75f-356e-4ed5-a82a-98617024f18c-serving-cert\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523743 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7b8192d-4515-4ac3-a253-b245bb57c64e-srv-cert\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523768 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-serving-cert\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523789 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523837 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w59xr\" (UniqueName: \"kubernetes.io/projected/212cd75f-356e-4ed5-a82a-98617024f18c-kube-api-access-w59xr\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 
13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523862 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26f66ad2-cd4c-4352-a060-f115420788ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-client\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523932 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-config\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.523957 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524011 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bw8\" (UniqueName: 
\"kubernetes.io/projected/3ff0b1ac-8f04-4329-a5c7-cef871a84890-kube-api-access-j2bw8\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524032 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-config\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524086 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-config\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524111 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524169 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-image-import-ca\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 
13:39:40.524208 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b19b5d90-1417-47fb-9c96-8558739656dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-dir\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524284 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.524304 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-service-ca-bundle\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526604 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ff0b1ac-8f04-4329-a5c7-cef871a84890-auth-proxy-config\") pod 
\"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526659 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvkzg\" (UniqueName: \"kubernetes.io/projected/8b019ae2-a243-4cd5-bc3b-b0428c74df07-kube-api-access-pvkzg\") pod \"package-server-manager-789f6589d5-gd5zm\" (UID: \"8b019ae2-a243-4cd5-bc3b-b0428c74df07\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526701 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80b19e22-85f2-482d-b4f9-525df7772776-serving-cert\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526730 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0e4912-6802-4b79-b7ed-aec44a875cfb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526765 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19b5d90-1417-47fb-9c96-8558739656dc-config\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 
13:39:40.526786 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526808 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80b19e22-85f2-482d-b4f9-525df7772776-etcd-client\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-dir\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526886 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7539887f-d1c5-417e-aaf3-669de74c241d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526902 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19b5d90-1417-47fb-9c96-8558739656dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: 
\"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526918 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526939 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-config\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526968 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abdae49-e923-4ba8-92f8-376d7cde1af2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.526984 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmrxw\" (UniqueName: \"kubernetes.io/projected/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-kube-api-access-fmrxw\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527004 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/91e70912-55cd-44d4-be6f-b6c637bec430-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zwfjn\" (UID: \"91e70912-55cd-44d4-be6f-b6c637bec430\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527024 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-config\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527046 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ae865cc-0785-4017-9e04-be7d244b0493-trusted-ca\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527131 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527148 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ff0b1ac-8f04-4329-a5c7-cef871a84890-config\") pod 
\"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527167 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-node-pullsecrets\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527182 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-encryption-config\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-etcd-ca\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527216 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527233 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/7539887f-d1c5-417e-aaf3-669de74c241d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527251 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-etcd-client\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527267 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce42be67-2e4d-4ca7-8ed8-5173d003c548-serving-cert\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527284 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527302 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527319 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527386 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrtr\" (UniqueName: \"kubernetes.io/projected/d7b8192d-4515-4ac3-a253-b245bb57c64e-kube-api-access-mxrtr\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527409 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527438 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zjd\" (UniqueName: \"kubernetes.io/projected/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-kube-api-access-d4zjd\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527474 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527489 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvgjz\" (UniqueName: \"kubernetes.io/projected/b719c2e8-d04c-4b7e-998c-643f5b166d13-kube-api-access-wvgjz\") pod \"migrator-59844c95c7-hcsr7\" (UID: \"b719c2e8-d04c-4b7e-998c-643f5b166d13\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527509 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/26f66ad2-cd4c-4352-a060-f115420788ab-tmpfs\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527534 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qs6\" (UniqueName: \"kubernetes.io/projected/2abdae49-e923-4ba8-92f8-376d7cde1af2-kube-api-access-h6qs6\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527550 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed0e4912-6802-4b79-b7ed-aec44a875cfb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527567 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527586 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zghhr\" (UniqueName: \"kubernetes.io/projected/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-kube-api-access-zghhr\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527601 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527616 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527633 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-fwfw9\" (UniqueName: \"kubernetes.io/projected/80b19e22-85f2-482d-b4f9-525df7772776-kube-api-access-fwfw9\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527649 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znxsr\" (UniqueName: \"kubernetes.io/projected/7539887f-d1c5-417e-aaf3-669de74c241d-kube-api-access-znxsr\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527664 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zhgs\" (UniqueName: \"kubernetes.io/projected/ce42be67-2e4d-4ca7-8ed8-5173d003c548-kube-api-access-5zhgs\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527681 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-etcd-serving-ca\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-chw44\" (UID: 
\"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.527809 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-dir\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.534046 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.534296 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.534809 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.535477 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.536800 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.540413 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.540706 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.540954 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-etcd-service-ca\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541005 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-audit-dir\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx67f\" (UniqueName: \"kubernetes.io/projected/b69e1571-8ffe-4425-917c-bb7021c3c74b-kube-api-access-nx67f\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541087 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541115 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7b8192d-4515-4ac3-a253-b245bb57c64e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541143 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541197 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-serving-cert\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541222 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvtbr\" (UniqueName: \"kubernetes.io/projected/ed0e4912-6802-4b79-b7ed-aec44a875cfb-kube-api-access-pvtbr\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541246 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5btkk\" (UniqueName: \"kubernetes.io/projected/1ae865cc-0785-4017-9e04-be7d244b0493-kube-api-access-5btkk\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: 
\"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541269 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541297 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7540b9-c6a0-41db-a094-486631000bdd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541363 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-client-ca\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541389 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b019ae2-a243-4cd5-bc3b-b0428c74df07-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gd5zm\" (UID: \"8b019ae2-a243-4cd5-bc3b-b0428c74df07\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 
13:39:40.541416 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kh74\" (UniqueName: \"kubernetes.io/projected/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-kube-api-access-8kh74\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541531 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-audit-dir\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541687 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-node-pullsecrets\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.541778 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zmlhc"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.542299 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.542965 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.543417 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-audit\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.544034 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-config\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.544249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-images\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.544472 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-config\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.544738 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-client-ca\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.544748 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212cd75f-356e-4ed5-a82a-98617024f18c-serving-cert\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.544860 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-config\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.545883 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-client-ca\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.546230 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.546285 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-r7crv"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.546822 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-config\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.547485 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-image-import-ca\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.547890 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.547931 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wpmxq"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.548149 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.571507 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-etcd-serving-ca\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.571658 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.571796 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rgdm8"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.573435 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.573573 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-serving-cert\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.580735 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abdae49-e923-4ba8-92f8-376d7cde1af2-serving-cert\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.581012 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.583838 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-etcd-client\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.583969 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.585153 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.585198 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjxqs"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.585276 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.585327 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kk6dj"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.586591 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-encryption-config\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.587859 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.587940 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.591226 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.594629 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-chw44"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.596276 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.597542 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-sfqhx"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.598955 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.599916 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xcl4x"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.601005 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jzv9l"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.602003 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.603363 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.604621 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.605667 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.606950 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.607936 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.609397 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.610437 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-r7bcf"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.611828 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.612949 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxbs7"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.614102 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xsqjd"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.615108 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rgdm8"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.617216 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"] 
Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.618154 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.618648 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.620402 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jr7t5"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.621567 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jr7t5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.621990 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lsczd"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.622490 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.625801 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.628685 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.631002 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9f8ft"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.632564 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zmlhc"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.633959 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kk6dj"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.635471 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.636639 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.637791 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j4nkp"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.638242 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.639155 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn"] Oct 01 
13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.640518 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.641574 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.641919 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26f66ad2-cd4c-4352-a060-f115420788ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.641949 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-config\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.641969 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-config\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.641988 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642006 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bw8\" (UniqueName: \"kubernetes.io/projected/3ff0b1ac-8f04-4329-a5c7-cef871a84890-kube-api-access-j2bw8\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642043 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b19b5d90-1417-47fb-9c96-8558739656dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642064 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-service-ca-bundle\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642079 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-dir\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642094 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ff0b1ac-8f04-4329-a5c7-cef871a84890-auth-proxy-config\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642109 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvkzg\" (UniqueName: \"kubernetes.io/projected/8b019ae2-a243-4cd5-bc3b-b0428c74df07-kube-api-access-pvkzg\") pod \"package-server-manager-789f6589d5-gd5zm\" (UID: \"8b019ae2-a243-4cd5-bc3b-b0428c74df07\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642125 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80b19e22-85f2-482d-b4f9-525df7772776-serving-cert\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642144 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0e4912-6802-4b79-b7ed-aec44a875cfb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: 
I1001 13:39:40.642159 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80b19e22-85f2-482d-b4f9-525df7772776-etcd-client\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642175 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7539887f-d1c5-417e-aaf3-669de74c241d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642191 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19b5d90-1417-47fb-9c96-8558739656dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642204 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19b5d90-1417-47fb-9c96-8558739656dc-config\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642222 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: 
\"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642239 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmrxw\" (UniqueName: \"kubernetes.io/projected/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-kube-api-access-fmrxw\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642254 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/91e70912-55cd-44d4-be6f-b6c637bec430-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zwfjn\" (UID: \"91e70912-55cd-44d4-be6f-b6c637bec430\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642271 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642286 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ae865cc-0785-4017-9e04-be7d244b0493-trusted-ca\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642300 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642315 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ff0b1ac-8f04-4329-a5c7-cef871a84890-config\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642330 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-etcd-ca\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642345 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7539887f-d1c5-417e-aaf3-669de74c241d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642360 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce42be67-2e4d-4ca7-8ed8-5173d003c548-serving-cert\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc 
kubenswrapper[4774]: I1001 13:39:40.642377 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642393 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642410 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642432 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrtr\" (UniqueName: \"kubernetes.io/projected/d7b8192d-4515-4ac3-a253-b245bb57c64e-kube-api-access-mxrtr\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642462 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zjd\" (UniqueName: 
\"kubernetes.io/projected/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-kube-api-access-d4zjd\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642477 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642493 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvgjz\" (UniqueName: \"kubernetes.io/projected/b719c2e8-d04c-4b7e-998c-643f5b166d13-kube-api-access-wvgjz\") pod \"migrator-59844c95c7-hcsr7\" (UID: \"b719c2e8-d04c-4b7e-998c-643f5b166d13\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642509 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/26f66ad2-cd4c-4352-a060-f115420788ab-tmpfs\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc 
kubenswrapper[4774]: I1001 13:39:40.642536 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-config\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642559 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed0e4912-6802-4b79-b7ed-aec44a875cfb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642597 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642619 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfw9\" (UniqueName: \"kubernetes.io/projected/80b19e22-85f2-482d-b4f9-525df7772776-kube-api-access-fwfw9\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znxsr\" (UniqueName: \"kubernetes.io/projected/7539887f-d1c5-417e-aaf3-669de74c241d-kube-api-access-znxsr\") pod 
\"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642654 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zhgs\" (UniqueName: \"kubernetes.io/projected/ce42be67-2e4d-4ca7-8ed8-5173d003c548-kube-api-access-5zhgs\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642671 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642690 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642707 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-etcd-service-ca\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642734 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642758 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvtbr\" (UniqueName: \"kubernetes.io/projected/ed0e4912-6802-4b79-b7ed-aec44a875cfb-kube-api-access-pvtbr\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642777 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7b8192d-4515-4ac3-a253-b245bb57c64e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642792 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642808 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7540b9-c6a0-41db-a094-486631000bdd-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642838 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5btkk\" (UniqueName: \"kubernetes.io/projected/1ae865cc-0785-4017-9e04-be7d244b0493-kube-api-access-5btkk\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642855 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642873 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b019ae2-a243-4cd5-bc3b-b0428c74df07-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gd5zm\" (UID: \"8b019ae2-a243-4cd5-bc3b-b0428c74df07\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642892 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kh74\" (UniqueName: \"kubernetes.io/projected/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-kube-api-access-8kh74\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7539887f-d1c5-417e-aaf3-669de74c241d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642925 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26f66ad2-cd4c-4352-a060-f115420788ab-webhook-cert\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw7pq\" (UniqueName: \"kubernetes.io/projected/1c4215dd-b2d9-4617-9bc4-43536f0a06f6-kube-api-access-bw7pq\") pod \"downloads-7954f5f757-9f8ft\" (UID: \"1c4215dd-b2d9-4617-9bc4-43536f0a06f6\") " pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642962 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642978 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/da7540b9-c6a0-41db-a094-486631000bdd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642994 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ae865cc-0785-4017-9e04-be7d244b0493-metrics-tls\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643010 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-policies\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643027 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ae865cc-0785-4017-9e04-be7d244b0493-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643050 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7tdn\" (UniqueName: \"kubernetes.io/projected/da7540b9-c6a0-41db-a094-486631000bdd-kube-api-access-x7tdn\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643070 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmwps\" (UniqueName: \"kubernetes.io/projected/26f66ad2-cd4c-4352-a060-f115420788ab-kube-api-access-bmwps\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643088 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljfn\" (UniqueName: \"kubernetes.io/projected/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-kube-api-access-hljfn\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643105 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj2r6\" (UniqueName: \"kubernetes.io/projected/91e70912-55cd-44d4-be6f-b6c637bec430-kube-api-access-jj2r6\") pod \"control-plane-machine-set-operator-78cbb6b69f-zwfjn\" (UID: \"91e70912-55cd-44d4-be6f-b6c637bec430\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643122 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7b8192d-4515-4ac3-a253-b245bb57c64e-srv-cert\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643138 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ff0b1ac-8f04-4329-a5c7-cef871a84890-machine-approver-tls\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643190 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed0e4912-6802-4b79-b7ed-aec44a875cfb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643365 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-service-ca-bundle\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.643434 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-dir\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.644124 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.644165 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3ff0b1ac-8f04-4329-a5c7-cef871a84890-auth-proxy-config\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.645068 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-etcd-service-ca\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.645099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce42be67-2e4d-4ca7-8ed8-5173d003c548-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.645316 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.645560 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ff0b1ac-8f04-4329-a5c7-cef871a84890-config\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.646236 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.646341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80b19e22-85f2-482d-b4f9-525df7772776-serving-cert\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.646502 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7539887f-d1c5-417e-aaf3-669de74c241d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.646924 
4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-etcd-ca\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.642808 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jr7t5"] Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.647897 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.648671 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/26f66ad2-cd4c-4352-a060-f115420788ab-tmpfs\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.648790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80b19e22-85f2-482d-b4f9-525df7772776-config\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649034 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-policies\") pod 
\"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649190 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da7540b9-c6a0-41db-a094-486631000bdd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649320 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649381 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19b5d90-1417-47fb-9c96-8558739656dc-config\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649407 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649644 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da7540b9-c6a0-41db-a094-486631000bdd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.649932 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3ff0b1ac-8f04-4329-a5c7-cef871a84890-machine-approver-tls\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.650389 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.650436 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7539887f-d1c5-417e-aaf3-669de74c241d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.651031 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19b5d90-1417-47fb-9c96-8558739656dc-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.651439 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.652181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce42be67-2e4d-4ca7-8ed8-5173d003c548-serving-cert\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.652190 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.652315 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/91e70912-55cd-44d4-be6f-b6c637bec430-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zwfjn\" (UID: \"91e70912-55cd-44d4-be6f-b6c637bec430\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.652658 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.652915 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.653015 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed0e4912-6802-4b79-b7ed-aec44a875cfb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.653479 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80b19e22-85f2-482d-b4f9-525df7772776-etcd-client\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.653572 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.654274 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.655326 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.658788 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.678846 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.699077 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.705161 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-serving-cert\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:40 crc kubenswrapper[4774]: 
I1001 13:39:40.718416 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.738715 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.763716 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.778565 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.798876 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.818646 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.839129 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.858533 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.873485 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7b8192d-4515-4ac3-a253-b245bb57c64e-srv-cert\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 
13:39:40.878550 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.892786 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7b8192d-4515-4ac3-a253-b245bb57c64e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.899239 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.919688 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.939534 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.959537 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.979514 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 13:39:40 crc kubenswrapper[4774]: I1001 13:39:40.999057 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.019183 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.039886 4774 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.052860 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b019ae2-a243-4cd5-bc3b-b0428c74df07-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-gd5zm\" (UID: \"8b019ae2-a243-4cd5-bc3b-b0428c74df07\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.058874 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.070174 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26f66ad2-cd4c-4352-a060-f115420788ab-webhook-cert\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.073368 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26f66ad2-cd4c-4352-a060-f115420788ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.079436 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.089445 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.099442 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.128124 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.139900 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.140173 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.160407 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.180577 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.199844 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.218922 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 13:39:41 crc 
kubenswrapper[4774]: I1001 13:39:41.232766 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ae865cc-0785-4017-9e04-be7d244b0493-metrics-tls\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.247718 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.255426 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ae865cc-0785-4017-9e04-be7d244b0493-trusted-ca\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.258120 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.279886 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.299258 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.319212 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.339802 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.361035 4774 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.379534 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.399252 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.419541 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.439763 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.459304 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.478819 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.499143 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.517638 4774 request.go:700] Waited for 1.013054786s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/configmaps?fieldSelector=metadata.name%3Dservice-ca-operator-config&limit=500&resourceVersion=0 Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.519439 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.524689 4774 secret.go:188] 
Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.524806 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-encryption-config podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:42.024768532 +0000 UTC m=+153.914399169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-encryption-config") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync secret cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.525002 4774 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.525152 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-client podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:42.025113392 +0000 UTC m=+153.914744029 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-client") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync secret cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.526000 4774 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.526080 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:42.026058788 +0000 UTC m=+153.915689425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.528530 4774 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.528621 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:42.028596549 +0000 UTC m=+153.918227276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.539505 4774 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.539646 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-serving-ca podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:42.039623567 +0000 UTC m=+153.929254194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-serving-ca") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.541806 4774 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: E1001 13:39:41.541924 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-serving-cert podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:42.041904841 +0000 UTC m=+153.931535468 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-serving-cert") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync secret cache: timed out waiting for the condition Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.559217 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.580226 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.599954 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.619420 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.650499 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.659110 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.679808 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.718934 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.740237 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.759000 4774 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.781422 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.799721 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.819261 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.839086 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.859472 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.879375 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.899024 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.919875 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.939273 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.960238 4774 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.979178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 13:39:41 crc kubenswrapper[4774]: I1001 13:39:41.999144 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.049676 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzpg\" (UniqueName: \"kubernetes.io/projected/c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3-kube-api-access-wgzpg\") pod \"apiserver-76f77b778f-nndfg\" (UID: \"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3\") " pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.073815 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w59xr\" (UniqueName: \"kubernetes.io/projected/212cd75f-356e-4ed5-a82a-98617024f18c-kube-api-access-w59xr\") pod \"controller-manager-879f6c89f-wpmxq\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.079997 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.080153 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.080297 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-serving-cert\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.080417 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-encryption-config\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.080692 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-client\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.080847 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.090175 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qs6\" (UniqueName: 
\"kubernetes.io/projected/2abdae49-e923-4ba8-92f8-376d7cde1af2-kube-api-access-h6qs6\") pod \"route-controller-manager-6576b87f9c-ntw6z\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.108486 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zghhr\" (UniqueName: \"kubernetes.io/projected/22f1d9a2-5cf9-43f9-bd4e-822382f55a7c-kube-api-access-zghhr\") pod \"machine-api-operator-5694c8668f-r7crv\" (UID: \"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.119490 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.139673 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.160028 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.179651 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.193407 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.204691 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.209224 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.219698 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.221442 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.231976 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.240639 4774 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.261163 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.279202 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.298950 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.322354 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.339528 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 
13:39:42.360739 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.378428 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.399155 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.419414 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.432841 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r7crv"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.439422 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.457837 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.463345 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 13:39:42 crc kubenswrapper[4774]: W1001 13:39:42.466774 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2abdae49_e923_4ba8_92f8_376d7cde1af2.slice/crio-0099722f7b32fafffbd79588b9686c7213fb6f07003d089c6ca329624e0cc054 WatchSource:0}: Error finding container 0099722f7b32fafffbd79588b9686c7213fb6f07003d089c6ca329624e0cc054: Status 404 returned error can't find the container with id 
0099722f7b32fafffbd79588b9686c7213fb6f07003d089c6ca329624e0cc054 Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.478242 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-nndfg"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.479303 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 13:39:42 crc kubenswrapper[4774]: W1001 13:39:42.489705 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ff7b1d_7ac6_4fa8_8003_b2f68a12b2b3.slice/crio-a223d4efb09856dd577203376e42d04dbb081548a4a1b1c6ed753f3852141785 WatchSource:0}: Error finding container a223d4efb09856dd577203376e42d04dbb081548a4a1b1c6ed753f3852141785: Status 404 returned error can't find the container with id a223d4efb09856dd577203376e42d04dbb081548a4a1b1c6ed753f3852141785 Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.517655 4774 request.go:700] Waited for 1.87387372s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/serviceaccounts/ingress-operator/token Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.521512 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b19b5d90-1417-47fb-9c96-8558739656dc-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kgj6n\" (UID: \"b19b5d90-1417-47fb-9c96-8558739656dc\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.537016 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1ae865cc-0785-4017-9e04-be7d244b0493-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: 
\"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.548159 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.551240 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7539887f-d1c5-417e-aaf3-669de74c241d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.585203 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw7pq\" (UniqueName: \"kubernetes.io/projected/1c4215dd-b2d9-4617-9bc4-43536f0a06f6-kube-api-access-bw7pq\") pod \"downloads-7954f5f757-9f8ft\" (UID: \"1c4215dd-b2d9-4617-9bc4-43536f0a06f6\") " pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.594761 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvkzg\" (UniqueName: \"kubernetes.io/projected/8b019ae2-a243-4cd5-bc3b-b0428c74df07-kube-api-access-pvkzg\") pod \"package-server-manager-789f6589d5-gd5zm\" (UID: \"8b019ae2-a243-4cd5-bc3b-b0428c74df07\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.611725 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfw9\" (UniqueName: \"kubernetes.io/projected/80b19e22-85f2-482d-b4f9-525df7772776-kube-api-access-fwfw9\") pod \"etcd-operator-b45778765-kjxqs\" (UID: \"80b19e22-85f2-482d-b4f9-525df7772776\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.634938 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljfn\" (UniqueName: \"kubernetes.io/projected/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-kube-api-access-hljfn\") pod \"marketplace-operator-79b997595-kxbs7\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") " pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.655505 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7tdn\" (UniqueName: \"kubernetes.io/projected/da7540b9-c6a0-41db-a094-486631000bdd-kube-api-access-x7tdn\") pod \"openshift-controller-manager-operator-756b6f6bc6-kjwcn\" (UID: \"da7540b9-c6a0-41db-a094-486631000bdd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.674686 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmwps\" (UniqueName: \"kubernetes.io/projected/26f66ad2-cd4c-4352-a060-f115420788ab-kube-api-access-bmwps\") pod \"packageserver-d55dfcdfc-4fcb5\" (UID: \"26f66ad2-cd4c-4352-a060-f115420788ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.675171 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wpmxq"] Oct 01 13:39:42 crc kubenswrapper[4774]: W1001 13:39:42.683659 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod212cd75f_356e_4ed5_a82a_98617024f18c.slice/crio-59a7c0f64dc1c7abd139659682a2ab70dc174f6221c6fb61bfaadcd45498e77d WatchSource:0}: Error finding container 59a7c0f64dc1c7abd139659682a2ab70dc174f6221c6fb61bfaadcd45498e77d: Status 404 
returned error can't find the container with id 59a7c0f64dc1c7abd139659682a2ab70dc174f6221c6fb61bfaadcd45498e77d Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.689936 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.692643 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj2r6\" (UniqueName: \"kubernetes.io/projected/91e70912-55cd-44d4-be6f-b6c637bec430-kube-api-access-jj2r6\") pod \"control-plane-machine-set-operator-78cbb6b69f-zwfjn\" (UID: \"91e70912-55cd-44d4-be6f-b6c637bec430\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.698253 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.703918 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.708741 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" event={"ID":"212cd75f-356e-4ed5-a82a-98617024f18c","Type":"ContainerStarted","Data":"59a7c0f64dc1c7abd139659682a2ab70dc174f6221c6fb61bfaadcd45498e77d"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.710523 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" event={"ID":"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c","Type":"ContainerStarted","Data":"b1f52fb62f949141fa9b54e1adc0ec540cbfb2dcebf56038a8b6a169752f94c5"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.710559 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" event={"ID":"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c","Type":"ContainerStarted","Data":"9fa0a6b279775ccb9fdb3feef0ecac88b18ec4b4999a34e12b52ab8f0f47f740"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.710571 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" event={"ID":"22f1d9a2-5cf9-43f9-bd4e-822382f55a7c","Type":"ContainerStarted","Data":"2103913674b0d193015d357091f0a2701a82a448f4ce0e5836f651a6bb582d28"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.710991 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvtbr\" (UniqueName: \"kubernetes.io/projected/ed0e4912-6802-4b79-b7ed-aec44a875cfb-kube-api-access-pvtbr\") pod \"openshift-apiserver-operator-796bbdcf4f-6z77g\" (UID: \"ed0e4912-6802-4b79-b7ed-aec44a875cfb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.712204 4774 
generic.go:334] "Generic (PLEG): container finished" podID="c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3" containerID="42c20e2f9c4c410ad0824435b06a8e786fdc2b5e6caffdff364856cecd9fb042" exitCode=0 Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.712259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" event={"ID":"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3","Type":"ContainerDied","Data":"42c20e2f9c4c410ad0824435b06a8e786fdc2b5e6caffdff364856cecd9fb042"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.712287 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" event={"ID":"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3","Type":"ContainerStarted","Data":"a223d4efb09856dd577203376e42d04dbb081548a4a1b1c6ed753f3852141785"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.713564 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" event={"ID":"2abdae49-e923-4ba8-92f8-376d7cde1af2","Type":"ContainerStarted","Data":"9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.713597 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" event={"ID":"2abdae49-e923-4ba8-92f8-376d7cde1af2","Type":"ContainerStarted","Data":"0099722f7b32fafffbd79588b9686c7213fb6f07003d089c6ca329624e0cc054"} Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.714383 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.715343 4774 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ntw6z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.715379 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" podUID="2abdae49-e923-4ba8-92f8-376d7cde1af2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.716942 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.732726 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znxsr\" (UniqueName: \"kubernetes.io/projected/7539887f-d1c5-417e-aaf3-669de74c241d-kube-api-access-znxsr\") pod \"cluster-image-registry-operator-dc59b4c8b-wzdb5\" (UID: \"7539887f-d1c5-417e-aaf3-669de74c241d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.739357 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.755268 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zhgs\" (UniqueName: \"kubernetes.io/projected/ce42be67-2e4d-4ca7-8ed8-5173d003c548-kube-api-access-5zhgs\") pod \"authentication-operator-69f744f599-jzv9l\" (UID: \"ce42be67-2e4d-4ca7-8ed8-5173d003c548\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.775590 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5btkk\" (UniqueName: \"kubernetes.io/projected/1ae865cc-0785-4017-9e04-be7d244b0493-kube-api-access-5btkk\") pod \"ingress-operator-5b745b69d9-g4z9v\" (UID: \"1ae865cc-0785-4017-9e04-be7d244b0493\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.782054 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.785825 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.792111 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.798213 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmrxw\" (UniqueName: \"kubernetes.io/projected/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-kube-api-access-fmrxw\") pod \"oauth-openshift-558db77b4-chw44\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.798691 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.814080 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kh74\" (UniqueName: \"kubernetes.io/projected/36ad3550-2755-4d27-8cfb-11b0c82f1bb0-kube-api-access-8kh74\") pod \"kube-storage-version-migrator-operator-b67b599dd-6dcpv\" (UID: \"36ad3550-2755-4d27-8cfb-11b0c82f1bb0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.835529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrtr\" (UniqueName: \"kubernetes.io/projected/d7b8192d-4515-4ac3-a253-b245bb57c64e-kube-api-access-mxrtr\") pod \"olm-operator-6b444d44fb-9ljx9\" (UID: \"d7b8192d-4515-4ac3-a253-b245bb57c64e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.863323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bw8\" (UniqueName: \"kubernetes.io/projected/3ff0b1ac-8f04-4329-a5c7-cef871a84890-kube-api-access-j2bw8\") pod \"machine-approver-56656f9798-ddtd2\" (UID: \"3ff0b1ac-8f04-4329-a5c7-cef871a84890\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.882494 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvgjz\" (UniqueName: \"kubernetes.io/projected/b719c2e8-d04c-4b7e-998c-643f5b166d13-kube-api-access-wvgjz\") pod \"migrator-59844c95c7-hcsr7\" (UID: \"b719c2e8-d04c-4b7e-998c-643f5b166d13\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.884123 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9f8ft"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.885951 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.895565 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4zjd\" (UniqueName: \"kubernetes.io/projected/a0c0ab2f-639f-4892-80a8-d0ed090e6d5f-kube-api-access-d4zjd\") pod \"openshift-config-operator-7777fb866f-xfhs8\" (UID: \"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.898815 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.907691 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-client\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.909706 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.919362 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.921777 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.934305 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.939677 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.950878 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx67f\" (UniqueName: \"kubernetes.io/projected/b69e1571-8ffe-4425-917c-bb7021c3c74b-kube-api-access-nx67f\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.963523 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.978909 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.986658 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-serving-cert\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.986972 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.993256 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kjxqs"] Oct 01 13:39:42 crc kubenswrapper[4774]: I1001 13:39:42.999125 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.008469 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b69e1571-8ffe-4425-917c-bb7021c3c74b-encryption-config\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.018850 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.038598 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.045292 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.048667 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.054494 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.057180 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.061782 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.068001 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.081702 4774 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.081778 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.081759719 +0000 UTC m=+155.971390316 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.081791 4774 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.081988 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies podName:b69e1571-8ffe-4425-917c-bb7021c3c74b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.081960175 +0000 UTC m=+155.971590842 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies") pod "apiserver-7bbb656c7d-vnt9h" (UID: "b69e1571-8ffe-4425-917c-bb7021c3c74b") : failed to sync configmap cache: timed out waiting for the condition Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.083687 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvj64\" (UniqueName: \"kubernetes.io/projected/cd5a9b77-d066-401b-a7c2-0c331cc8dd2e-kube-api-access-rvj64\") pod \"cluster-samples-operator-665b6dd947-ldzzw\" (UID: \"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093470 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093500 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08c4a4cd-1564-42d3-a19f-0ef17b65d5be-metrics-tls\") pod \"dns-operator-744455d44c-sfqhx\" (UID: \"08c4a4cd-1564-42d3-a19f-0ef17b65d5be\") " pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-certificates\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093552 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-trusted-ca\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093568 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92ggd\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-kube-api-access-92ggd\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093589 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-metrics-certs\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093608 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-service-ca-bundle\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093629 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-bound-sa-token\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093645 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-tls\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093663 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/cd5a9b77-d066-401b-a7c2-0c331cc8dd2e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldzzw\" (UID: \"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093694 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093713 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdd6\" (UniqueName: \"kubernetes.io/projected/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-kube-api-access-6wdd6\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093743 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-stats-auth\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093761 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxcx\" (UniqueName: \"kubernetes.io/projected/08c4a4cd-1564-42d3-a19f-0ef17b65d5be-kube-api-access-xtxcx\") pod \"dns-operator-744455d44c-sfqhx\" (UID: \"08c4a4cd-1564-42d3-a19f-0ef17b65d5be\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093805 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-default-certificate\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.093825 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.094122 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:43.594111064 +0000 UTC m=+155.483741661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.095153 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.109605 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.121479 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5"] Oct 01 13:39:43 crc kubenswrapper[4774]: W1001 13:39:43.148749 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26f66ad2_cd4c_4352_a060_f115420788ab.slice/crio-8b94ce64fa53a71eb371ea2fdc356acc601cc3861d1f97e95ef0d22397029eee WatchSource:0}: Error finding container 8b94ce64fa53a71eb371ea2fdc356acc601cc3861d1f97e95ef0d22397029eee: Status 404 returned error can't find the container with id 8b94ce64fa53a71eb371ea2fdc356acc601cc3861d1f97e95ef0d22397029eee Oct 01 13:39:43 crc kubenswrapper[4774]: W1001 13:39:43.149764 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e70912_55cd_44d4_be6f_b6c637bec430.slice/crio-4ce7f2bac482042e1321ece899e67ae2aa7dc1762d7b8846cec3863e7da31fe3 WatchSource:0}: Error finding container 4ce7f2bac482042e1321ece899e67ae2aa7dc1762d7b8846cec3863e7da31fe3: Status 404 returned 
error can't find the container with id 4ce7f2bac482042e1321ece899e67ae2aa7dc1762d7b8846cec3863e7da31fe3 Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.196163 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.196331 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-certificates\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.196352 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-trusted-ca\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.196368 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92ggd\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-kube-api-access-92ggd\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.196388 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-service-ca\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.196466 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:43.696420233 +0000 UTC m=+155.586050830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197207 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7313744-98b0-4a56-bd19-280760be765f-config\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197547 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mh7\" (UniqueName: \"kubernetes.io/projected/2539b1bd-ceb0-4918-a20c-56775bc1bb17-kube-api-access-74mh7\") pod \"multus-admission-controller-857f4d67dd-j4nkp\" (UID: \"2539b1bd-ceb0-4918-a20c-56775bc1bb17\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" Oct 01 13:39:43 crc 
kubenswrapper[4774]: I1001 13:39:43.197575 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qts76\" (UniqueName: \"kubernetes.io/projected/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-kube-api-access-qts76\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197601 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-certs\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197783 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8fhh\" (UniqueName: \"kubernetes.io/projected/2aeefd1c-f8aa-483d-bf3e-424600e9557e-kube-api-access-r8fhh\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9709232-51d8-4109-8153-43c567908267-trusted-ca\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197825 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e558dddd-0d3c-46a2-aa57-33227ff3054d-config\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: 
\"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197879 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-metrics-certs\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197898 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t58pf\" (UniqueName: \"kubernetes.io/projected/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-kube-api-access-t58pf\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197914 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fcafec0-e631-4583-b508-14b1bc9be3b6-cert\") pod \"ingress-canary-jr7t5\" (UID: \"9fcafec0-e631-4583-b508-14b1bc9be3b6\") " pod="openshift-ingress-canary/ingress-canary-jr7t5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.197939 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-service-ca-bundle\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198215 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f6078b99-d5d2-48ce-89c7-163eca80ff85-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198235 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-serving-cert\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198272 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2539b1bd-ceb0-4918-a20c-56775bc1bb17-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j4nkp\" (UID: \"2539b1bd-ceb0-4918-a20c-56775bc1bb17\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198364 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-socket-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198381 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2aeefd1c-f8aa-483d-bf3e-424600e9557e-profile-collector-cert\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 
01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198406 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-bound-sa-token\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198420 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9709232-51d8-4109-8153-43c567908267-serving-cert\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198470 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-tls\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198487 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5a9b77-d066-401b-a7c2-0c331cc8dd2e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldzzw\" (UID: \"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198505 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbtx\" (UniqueName: \"kubernetes.io/projected/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-kube-api-access-4pbtx\") pod 
\"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198520 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-metrics-tls\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198561 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtbqf\" (UniqueName: \"kubernetes.io/projected/9fcafec0-e631-4583-b508-14b1bc9be3b6-kube-api-access-wtbqf\") pod \"ingress-canary-jr7t5\" (UID: \"9fcafec0-e631-4583-b508-14b1bc9be3b6\") " pod="openshift-ingress-canary/ingress-canary-jr7t5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198575 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cphwm\" (UniqueName: \"kubernetes.io/projected/e558dddd-0d3c-46a2-aa57-33227ff3054d-kube-api-access-cphwm\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198590 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7313744-98b0-4a56-bd19-280760be765f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198645 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198663 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2lf\" (UniqueName: \"kubernetes.io/projected/e9709232-51d8-4109-8153-43c567908267-kube-api-access-zb2lf\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198697 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhxk\" (UniqueName: \"kubernetes.io/projected/00147e48-69ae-44af-8330-8cdf7618a470-kube-api-access-hkhxk\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198737 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdd6\" (UniqueName: \"kubernetes.io/projected/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-kube-api-access-6wdd6\") pod 
\"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198811 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-csi-data-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198825 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7313744-98b0-4a56-bd19-280760be765f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198872 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6078b99-d5d2-48ce-89c7-163eca80ff85-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198887 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-oauth-serving-cert\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198902 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-images\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-proxy-tls\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198951 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-stats-auth\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198966 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxcx\" (UniqueName: \"kubernetes.io/projected/08c4a4cd-1564-42d3-a19f-0ef17b65d5be-kube-api-access-xtxcx\") pod \"dns-operator-744455d44c-sfqhx\" (UID: \"08c4a4cd-1564-42d3-a19f-0ef17b65d5be\") " pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.198981 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-config-volume\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj" Oct 01 
13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199010 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmsf\" (UniqueName: \"kubernetes.io/projected/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-kube-api-access-9gmsf\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199056 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-config\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199067 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-trusted-ca\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199077 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-config-volume\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199125 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-registration-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199192 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-trusted-ca-bundle\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199212 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2aeefd1c-f8aa-483d-bf3e-424600e9557e-srv-cert\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.199973 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-service-ca-bundle\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.207783 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-certificates\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.213618 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6078b99-d5d2-48ce-89c7-163eca80ff85-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.213749 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-signing-key\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.213768 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00147e48-69ae-44af-8330-8cdf7618a470-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"
Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.213818 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:43.713796499 +0000 UTC m=+155.603427106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.214434 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxdt\" (UniqueName: \"kubernetes.io/projected/16564c2c-07e2-4d6e-9f35-fd14654d1538-kube-api-access-gxxdt\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.214694 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-oauth-config\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.214764 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np74x\" (UniqueName: \"kubernetes.io/projected/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-kube-api-access-np74x\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.214783 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-node-bootstrap-token\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.215709 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-default-certificate\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.215822 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24db\" (UniqueName: \"kubernetes.io/projected/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-kube-api-access-v24db\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.215900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.215989 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00147e48-69ae-44af-8330-8cdf7618a470-proxy-tls\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.216270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e558dddd-0d3c-46a2-aa57-33227ff3054d-serving-cert\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.216323 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvj64\" (UniqueName: \"kubernetes.io/projected/cd5a9b77-d066-401b-a7c2-0c331cc8dd2e-kube-api-access-rvj64\") pod \"cluster-samples-operator-665b6dd947-ldzzw\" (UID: \"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.216344 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-plugins-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.217425 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-metrics-certs\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.217764 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-signing-cabundle\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.217817 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-secret-volume\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.218654 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.218719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9709232-51d8-4109-8153-43c567908267-config\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.218991 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.219106 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd5a9b77-d066-401b-a7c2-0c331cc8dd2e-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ldzzw\" (UID: \"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.219609 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08c4a4cd-1564-42d3-a19f-0ef17b65d5be-metrics-tls\") pod \"dns-operator-744455d44c-sfqhx\" (UID: \"08c4a4cd-1564-42d3-a19f-0ef17b65d5be\") " pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.219655 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-mountpoint-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.221098 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-stats-auth\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.221663 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92ggd\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-kube-api-access-92ggd\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.222647 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-default-certificate\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.228259 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08c4a4cd-1564-42d3-a19f-0ef17b65d5be-metrics-tls\") pod \"dns-operator-744455d44c-sfqhx\" (UID: \"08c4a4cd-1564-42d3-a19f-0ef17b65d5be\") " pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.228643 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-tls\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.229924 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.260853 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxcx\" (UniqueName: \"kubernetes.io/projected/08c4a4cd-1564-42d3-a19f-0ef17b65d5be-kube-api-access-xtxcx\") pod \"dns-operator-744455d44c-sfqhx\" (UID: \"08c4a4cd-1564-42d3-a19f-0ef17b65d5be\") " pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.278179 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-bound-sa-token\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.310307 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdd6\" (UniqueName: \"kubernetes.io/projected/ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823-kube-api-access-6wdd6\") pod \"router-default-5444994796-8gh62\" (UID: \"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823\") " pod="openshift-ingress/router-default-5444994796-8gh62"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.316368 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvj64\" (UniqueName: \"kubernetes.io/projected/cd5a9b77-d066-401b-a7c2-0c331cc8dd2e-kube-api-access-rvj64\") pod \"cluster-samples-operator-665b6dd947-ldzzw\" (UID: \"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.320697 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.320911 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-plugins-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.320949 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-signing-cabundle\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.320970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-secret-volume\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.320992 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9709232-51d8-4109-8153-43c567908267-config\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321028 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-mountpoint-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321050 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-service-ca\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321069 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7313744-98b0-4a56-bd19-280760be765f-config\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321093 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74mh7\" (UniqueName: \"kubernetes.io/projected/2539b1bd-ceb0-4918-a20c-56775bc1bb17-kube-api-access-74mh7\") pod \"multus-admission-controller-857f4d67dd-j4nkp\" (UID: \"2539b1bd-ceb0-4918-a20c-56775bc1bb17\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321115 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qts76\" (UniqueName: \"kubernetes.io/projected/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-kube-api-access-qts76\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321136 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-certs\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321157 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8fhh\" (UniqueName: \"kubernetes.io/projected/2aeefd1c-f8aa-483d-bf3e-424600e9557e-kube-api-access-r8fhh\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321195 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9709232-51d8-4109-8153-43c567908267-trusted-ca\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321217 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e558dddd-0d3c-46a2-aa57-33227ff3054d-config\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321238 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fcafec0-e631-4583-b508-14b1bc9be3b6-cert\") pod \"ingress-canary-jr7t5\" (UID: \"9fcafec0-e631-4583-b508-14b1bc9be3b6\") " pod="openshift-ingress-canary/ingress-canary-jr7t5"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321256 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t58pf\" (UniqueName: \"kubernetes.io/projected/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-kube-api-access-t58pf\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321277 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6078b99-d5d2-48ce-89c7-163eca80ff85-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321298 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-serving-cert\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321318 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2539b1bd-ceb0-4918-a20c-56775bc1bb17-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j4nkp\" (UID: \"2539b1bd-ceb0-4918-a20c-56775bc1bb17\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321341 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2aeefd1c-f8aa-483d-bf3e-424600e9557e-profile-collector-cert\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321361 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-socket-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321381 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9709232-51d8-4109-8153-43c567908267-serving-cert\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321402 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbtx\" (UniqueName: \"kubernetes.io/projected/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-kube-api-access-4pbtx\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321421 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321441 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-metrics-tls\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321482 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7313744-98b0-4a56-bd19-280760be765f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321508 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtbqf\" (UniqueName: \"kubernetes.io/projected/9fcafec0-e631-4583-b508-14b1bc9be3b6-kube-api-access-wtbqf\") pod \"ingress-canary-jr7t5\" (UID: \"9fcafec0-e631-4583-b508-14b1bc9be3b6\") " pod="openshift-ingress-canary/ingress-canary-jr7t5"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321526 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cphwm\" (UniqueName: \"kubernetes.io/projected/e558dddd-0d3c-46a2-aa57-33227ff3054d-kube-api-access-cphwm\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321567 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2lf\" (UniqueName: \"kubernetes.io/projected/e9709232-51d8-4109-8153-43c567908267-kube-api-access-zb2lf\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321591 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhxk\" (UniqueName: \"kubernetes.io/projected/00147e48-69ae-44af-8330-8cdf7618a470-kube-api-access-hkhxk\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321614 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-csi-data-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321633 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7313744-98b0-4a56-bd19-280760be765f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321657 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6078b99-d5d2-48ce-89c7-163eca80ff85-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321674 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-oauth-serving-cert\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321692 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-images\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321711 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-proxy-tls\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-config-volume\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321758 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmsf\" (UniqueName: \"kubernetes.io/projected/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-kube-api-access-9gmsf\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321777 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-config\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321795 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-config-volume\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321813 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-registration-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321834 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-trusted-ca-bundle\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321853 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2aeefd1c-f8aa-483d-bf3e-424600e9557e-srv-cert\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321875 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6078b99-d5d2-48ce-89c7-163eca80ff85-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321898 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-signing-key\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321919 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00147e48-69ae-44af-8330-8cdf7618a470-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321938 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxdt\" (UniqueName: \"kubernetes.io/projected/16564c2c-07e2-4d6e-9f35-fd14654d1538-kube-api-access-gxxdt\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321956 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-oauth-config\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np74x\" (UniqueName: \"kubernetes.io/projected/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-kube-api-access-np74x\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.321992 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-node-bootstrap-token\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.322023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24db\" (UniqueName: \"kubernetes.io/projected/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-kube-api-access-v24db\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.322047 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00147e48-69ae-44af-8330-8cdf7618a470-proxy-tls\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"
Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.322070 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e558dddd-0d3c-46a2-aa57-33227ff3054d-serving-cert\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"
Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.322329 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:43.822313141 +0000 UTC m=+155.711943728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.322607 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-plugins-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.323776 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-signing-cabundle\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.324173 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e558dddd-0d3c-46a2-aa57-33227ff3054d-config\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.326424 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/00147e48-69ae-44af-8330-8cdf7618a470-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.327028 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-config-volume\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.327095 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-registration-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.328813 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fcafec0-e631-4583-b508-14b1bc9be3b6-cert\") pod \"ingress-canary-jr7t5\" (UID: \"9fcafec0-e631-4583-b508-14b1bc9be3b6\") " pod="openshift-ingress-canary/ingress-canary-jr7t5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.328826 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-trusted-ca-bundle\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.334420 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-config\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " 
pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.334736 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-csi-data-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.327932 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e558dddd-0d3c-46a2-aa57-33227ff3054d-serving-cert\") pod \"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.335124 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6078b99-d5d2-48ce-89c7-163eca80ff85-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.343060 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2aeefd1c-f8aa-483d-bf3e-424600e9557e-srv-cert\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.343586 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-images\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: 
\"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.343982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6078b99-d5d2-48ce-89c7-163eca80ff85-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.344446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-oauth-serving-cert\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.346722 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.346821 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.348383 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-config-volume\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.348704 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7313744-98b0-4a56-bd19-280760be765f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.348820 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-socket-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.348846 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7313744-98b0-4a56-bd19-280760be765f-config\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.349244 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9709232-51d8-4109-8153-43c567908267-trusted-ca\") pod \"console-operator-58897d9998-zmlhc\" (UID: 
\"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.349558 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-mountpoint-dir\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.352223 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16564c2c-07e2-4d6e-9f35-fd14654d1538-service-ca\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.354199 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-secret-volume\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.355898 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-serving-cert\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.356029 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16564c2c-07e2-4d6e-9f35-fd14654d1538-console-oauth-config\") pod \"console-f9d7485db-r7bcf\" (UID: 
\"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.356855 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-proxy-tls\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.356860 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-signing-key\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.359595 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2aeefd1c-f8aa-483d-bf3e-424600e9557e-profile-collector-cert\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.359934 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9709232-51d8-4109-8153-43c567908267-config\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.361399 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-metrics-tls\") pod \"dns-default-kk6dj\" (UID: 
\"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.361939 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9709232-51d8-4109-8153-43c567908267-serving-cert\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.368393 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/00147e48-69ae-44af-8330-8cdf7618a470-proxy-tls\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.369843 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2539b1bd-ceb0-4918-a20c-56775bc1bb17-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j4nkp\" (UID: \"2539b1bd-ceb0-4918-a20c-56775bc1bb17\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.372539 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-node-bootstrap-token\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.374348 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.384330 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f7313744-98b0-4a56-bd19-280760be765f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hz99\" (UID: \"f7313744-98b0-4a56-bd19-280760be765f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.385340 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-certs\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.387197 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t58pf\" (UniqueName: \"kubernetes.io/projected/6737d9a0-288f-44bf-a67a-0a7cb37e89f9-kube-api-access-t58pf\") pod \"machine-config-server-lsczd\" (UID: \"6737d9a0-288f-44bf-a67a-0a7cb37e89f9\") " pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.396867 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtbqf\" (UniqueName: \"kubernetes.io/projected/9fcafec0-e631-4583-b508-14b1bc9be3b6-kube-api-access-wtbqf\") pod \"ingress-canary-jr7t5\" (UID: \"9fcafec0-e631-4583-b508-14b1bc9be3b6\") " pod="openshift-ingress-canary/ingress-canary-jr7t5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.417669 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cphwm\" (UniqueName: \"kubernetes.io/projected/e558dddd-0d3c-46a2-aa57-33227ff3054d-kube-api-access-cphwm\") pod 
\"service-ca-operator-777779d784-m5w5f\" (UID: \"e558dddd-0d3c-46a2-aa57-33227ff3054d\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.421420 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.424660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.424996 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:43.92498045 +0000 UTC m=+155.814611047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.443751 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2lf\" (UniqueName: \"kubernetes.io/projected/e9709232-51d8-4109-8153-43c567908267-kube-api-access-zb2lf\") pod \"console-operator-58897d9998-zmlhc\" (UID: \"e9709232-51d8-4109-8153-43c567908267\") " pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.455041 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhxk\" (UniqueName: \"kubernetes.io/projected/00147e48-69ae-44af-8330-8cdf7618a470-kube-api-access-hkhxk\") pod \"machine-config-controller-84d6567774-z7j9r\" (UID: \"00147e48-69ae-44af-8330-8cdf7618a470\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.465825 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.469122 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.475843 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.477704 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxdt\" (UniqueName: \"kubernetes.io/projected/16564c2c-07e2-4d6e-9f35-fd14654d1538-kube-api-access-gxxdt\") pod \"console-f9d7485db-r7bcf\" (UID: \"16564c2c-07e2-4d6e-9f35-fd14654d1538\") " pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.483557 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.497847 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6078b99-d5d2-48ce-89c7-163eca80ff85-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vr5w\" (UID: \"f6078b99-d5d2-48ce-89c7-163eca80ff85\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.515841 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jr7t5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.518375 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jzv9l"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.530968 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lsczd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.533100 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.533672 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-chw44"] Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.533910 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.033864332 +0000 UTC m=+155.923494929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.534762 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxbs7"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.581499 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24db\" (UniqueName: \"kubernetes.io/projected/63a3d8f9-c505-4185-86fb-31eaf6c4cd72-kube-api-access-v24db\") pod \"machine-config-operator-74547568cd-4gs6j\" (UID: \"63a3d8f9-c505-4185-86fb-31eaf6c4cd72\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.583698 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbtx\" (UniqueName: \"kubernetes.io/projected/c6ea90f2-b0dc-4809-a02d-c44eda1431c2-kube-api-access-4pbtx\") pod \"csi-hostpathplugin-rgdm8\" (UID: \"c6ea90f2-b0dc-4809-a02d-c44eda1431c2\") " pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.590807 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.590862 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.600700 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-np74x\" (UniqueName: \"kubernetes.io/projected/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-kube-api-access-np74x\") pod \"collect-profiles-29322090-dqdht\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.611096 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.613390 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.629147 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qts76\" (UniqueName: \"kubernetes.io/projected/03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010-kube-api-access-qts76\") pod \"dns-default-kk6dj\" (UID: \"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010\") " pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.632058 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmsf\" (UniqueName: \"kubernetes.io/projected/ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868-kube-api-access-9gmsf\") pod \"service-ca-9c57cc56f-xsqjd\" (UID: \"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868\") " pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.641184 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc 
kubenswrapper[4774]: E1001 13:39:43.641709 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.141698165 +0000 UTC m=+156.031328762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: W1001 13:39:43.659708 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6737d9a0_288f_44bf_a67a_0a7cb37e89f9.slice/crio-d9df90d57dd8997cf54afb141ff1b38664dc20fd95f0969ab1c2578eee3eb7e2 WatchSource:0}: Error finding container d9df90d57dd8997cf54afb141ff1b38664dc20fd95f0969ab1c2578eee3eb7e2: Status 404 returned error can't find the container with id d9df90d57dd8997cf54afb141ff1b38664dc20fd95f0969ab1c2578eee3eb7e2 Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.661822 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mh7\" (UniqueName: \"kubernetes.io/projected/2539b1bd-ceb0-4918-a20c-56775bc1bb17-kube-api-access-74mh7\") pod \"multus-admission-controller-857f4d67dd-j4nkp\" (UID: \"2539b1bd-ceb0-4918-a20c-56775bc1bb17\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.665499 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8fhh\" (UniqueName: 
\"kubernetes.io/projected/2aeefd1c-f8aa-483d-bf3e-424600e9557e-kube-api-access-r8fhh\") pod \"catalog-operator-68c6474976-vtvsl\" (UID: \"2aeefd1c-f8aa-483d-bf3e-424600e9557e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.696680 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.707960 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.715082 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.725440 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.725704 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.732625 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.734658 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.743380 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.743791 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.243775208 +0000 UTC m=+156.133405805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.744013 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.748022 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.754399 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.754715 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" event={"ID":"ed0e4912-6802-4b79-b7ed-aec44a875cfb","Type":"ContainerStarted","Data":"ab102e516f9df185f2f8d81c3e39a3e80166e6ddbe083bc19eec40b3f50c3309"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.774997 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9f8ft" event={"ID":"1c4215dd-b2d9-4617-9bc4-43536f0a06f6","Type":"ContainerStarted","Data":"aea451c920fa7bc0afaab0ae05c2f896ff7aa62b4de4a2e4196bd824c94ce93e"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.775296 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9f8ft" event={"ID":"1c4215dd-b2d9-4617-9bc4-43536f0a06f6","Type":"ContainerStarted","Data":"cb97fec0318163effa8d653e401d7164396d1c37619306fe3b4b97184f00bc62"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.776088 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.785336 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" event={"ID":"ce42be67-2e4d-4ca7-8ed8-5173d003c548","Type":"ContainerStarted","Data":"7513bdad3cb6437bf832085a55b3198e4d81cdfd91f2d687092378f86d7304a0"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.786928 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-9f8ft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:39:43 crc 
kubenswrapper[4774]: I1001 13:39:43.786959 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9f8ft" podUID="1c4215dd-b2d9-4617-9bc4-43536f0a06f6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.789290 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" event={"ID":"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6","Type":"ContainerStarted","Data":"497971d059a5296cdc5cd92977b858e66ecef674aa22a20fa72183bbaefa2938"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.791019 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" event={"ID":"1ae865cc-0785-4017-9e04-be7d244b0493","Type":"ContainerStarted","Data":"5731aca51d0d8b08cbd53e82c2be12c36340170c194419c4f6e4031838281663"} Oct 01 13:39:43 crc kubenswrapper[4774]: W1001 13:39:43.791343 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7539887f_d1c5_417e_aaf3_669de74c241d.slice/crio-6cc9b2069ae3629b31251a08ba970cec99be7c3cbfc0c4a25a602c0bcd7853f9 WatchSource:0}: Error finding container 6cc9b2069ae3629b31251a08ba970cec99be7c3cbfc0c4a25a602c0bcd7853f9: Status 404 returned error can't find the container with id 6cc9b2069ae3629b31251a08ba970cec99be7c3cbfc0c4a25a602c0bcd7853f9 Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.805766 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" event={"ID":"3ff0b1ac-8f04-4329-a5c7-cef871a84890","Type":"ContainerStarted","Data":"282625a062abb1ee75d4748c20fab55d16297aa58e062856e5e35b9030dc1580"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.806853 4774 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.809051 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.817141 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-sfqhx"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.839444 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" event={"ID":"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3","Type":"ContainerStarted","Data":"4fdcb51ea02ca2da0d637992139ca51ecf6391f0ed4696a853393e34da6f2a34"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.844689 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.845135 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.34512467 +0000 UTC m=+156.234755267 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.845486 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lsczd" event={"ID":"6737d9a0-288f-44bf-a67a-0a7cb37e89f9","Type":"ContainerStarted","Data":"d9df90d57dd8997cf54afb141ff1b38664dc20fd95f0969ab1c2578eee3eb7e2"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.846171 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" event={"ID":"f03e45d8-27cd-4b18-87cf-e5abce7af1e2","Type":"ContainerStarted","Data":"a22c9dca26401e277d2f21a135c3aafff30f0e0836550d4aca1f2f373a9ad8ff"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.849747 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" event={"ID":"da7540b9-c6a0-41db-a094-486631000bdd","Type":"ContainerStarted","Data":"98e9339bd0d743849395e5036d7cd462a11d118af8d6ae03c7e3554e57590a12"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.849768 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" event={"ID":"da7540b9-c6a0-41db-a094-486631000bdd","Type":"ContainerStarted","Data":"314b6ea89fab87448005d12bd6ee8415dbe293a7207fc927458445106b473af7"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.856183 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" event={"ID":"b19b5d90-1417-47fb-9c96-8558739656dc","Type":"ContainerStarted","Data":"7060bc7a77b8d78bf2a2580277a7f054aa67c03b076f1bba8a988df31fba9b73"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.856226 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" event={"ID":"b19b5d90-1417-47fb-9c96-8558739656dc","Type":"ContainerStarted","Data":"28dd67cf6a41d5c0622603d34e2a3c5ea3cae88b62518aca9237eb712fe75d16"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.888277 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" event={"ID":"26f66ad2-cd4c-4352-a060-f115420788ab","Type":"ContainerStarted","Data":"8b4a1eb5012612782209d1af663ceeaf7fb1a0ee91efcc412169955b2dd30acb"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.888305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" event={"ID":"26f66ad2-cd4c-4352-a060-f115420788ab","Type":"ContainerStarted","Data":"8b94ce64fa53a71eb371ea2fdc356acc601cc3861d1f97e95ef0d22397029eee"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.888642 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.890716 4774 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4fcb5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.890756 4774 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" podUID="26f66ad2-cd4c-4352-a060-f115420788ab" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.892147 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" event={"ID":"8b019ae2-a243-4cd5-bc3b-b0428c74df07","Type":"ContainerStarted","Data":"0a5d35090c1c6f1582c5f55cc9a857246206e19fee7c27d2383a2a226fee1cea"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.892173 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" event={"ID":"8b019ae2-a243-4cd5-bc3b-b0428c74df07","Type":"ContainerStarted","Data":"b9841f162cd55b1b88ec0071182710172f822038cd91e7a7419849331006fff6"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.894625 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" event={"ID":"80b19e22-85f2-482d-b4f9-525df7772776","Type":"ContainerStarted","Data":"57af623e4d38842f8fdeba9ed99a211bccb4e7df35a70730ad43f5784943c8c8"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.894646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" event={"ID":"80b19e22-85f2-482d-b4f9-525df7772776","Type":"ContainerStarted","Data":"0d75494b2c5a2d97b04c3b3b1f0bc0feb66279f30552cdc48b93f485fede2dcf"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.903360 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" 
event={"ID":"36ad3550-2755-4d27-8cfb-11b0c82f1bb0","Type":"ContainerStarted","Data":"2f913c97b514912d4070a5f948726af7f14889c68229f9d7860348e9d1df6410"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.920268 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" event={"ID":"91e70912-55cd-44d4-be6f-b6c637bec430","Type":"ContainerStarted","Data":"33506947c2f6ad8bd404e2949d011a46dd00c2bb4c7f886bc2b66e56d850b9db"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.920315 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" event={"ID":"91e70912-55cd-44d4-be6f-b6c637bec430","Type":"ContainerStarted","Data":"4ce7f2bac482042e1321ece899e67ae2aa7dc1762d7b8846cec3863e7da31fe3"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.923468 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f"] Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.927827 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8gh62" event={"ID":"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823","Type":"ContainerStarted","Data":"6834c1a147bc888831d298eaddebc489764eca67d5f56da11b8e76d63776f9c0"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.935258 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" event={"ID":"212cd75f-356e-4ed5-a82a-98617024f18c","Type":"ContainerStarted","Data":"9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc"} Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.941176 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.941238 4774 patch_prober.go:28] 
interesting pod/controller-manager-879f6c89f-wpmxq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.941260 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" podUID="212cd75f-356e-4ed5-a82a-98617024f18c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.950092 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:43 crc kubenswrapper[4774]: E1001 13:39:43.950277 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.450260977 +0000 UTC m=+156.339891564 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:43 crc kubenswrapper[4774]: I1001 13:39:43.950766 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.051477 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.054376 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.554360556 +0000 UTC m=+156.443991153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: W1001 13:39:44.151033 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode558dddd_0d3c_46a2_aa57_33227ff3054d.slice/crio-b5bd22d10bd91afb5d3a2798c1356a7da41237a6ba0b398f3cdbc33295776cd0 WatchSource:0}: Error finding container b5bd22d10bd91afb5d3a2798c1356a7da41237a6ba0b398f3cdbc33295776cd0: Status 404 returned error can't find the container with id b5bd22d10bd91afb5d3a2798c1356a7da41237a6ba0b398f3cdbc33295776cd0 Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.155795 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.155933 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.155999 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.156575 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.656548562 +0000 UTC m=+156.546179159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.157122 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-audit-policies\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.157251 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b69e1571-8ffe-4425-917c-bb7021c3c74b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnt9h\" (UID: \"b69e1571-8ffe-4425-917c-bb7021c3c74b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.262375 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.262699 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.762684167 +0000 UTC m=+156.652314764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.277170 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.370301 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.370901 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.371103 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.871087556 +0000 UTC m=+156.760718153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.372571 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zmlhc"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.387253 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-r7bcf"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.414106 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.416868 4774 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.438739 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jr7t5"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.471966 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.472556 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:44.972543371 +0000 UTC m=+156.862173968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: W1001 13:39:44.495083 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16564c2c_07e2_4d6e_9f35_fd14654d1538.slice/crio-089a3aedf89a7448a44b0e4a787428e2ba15058e37e737693624dc1dad419113 WatchSource:0}: Error finding container 089a3aedf89a7448a44b0e4a787428e2ba15058e37e737693624dc1dad419113: Status 404 returned error can't find the container with id 089a3aedf89a7448a44b0e4a787428e2ba15058e37e737693624dc1dad419113 Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.577788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.578582 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.078557883 +0000 UTC m=+156.968188480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: W1001 13:39:44.611988 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9709232_51d8_4109_8153_43c567908267.slice/crio-72801e46f8a044422e1d3ecc37d49fd7d25b64cad80637ef02eb8a21dc35153c WatchSource:0}: Error finding container 72801e46f8a044422e1d3ecc37d49fd7d25b64cad80637ef02eb8a21dc35153c: Status 404 returned error can't find the container with id 72801e46f8a044422e1d3ecc37d49fd7d25b64cad80637ef02eb8a21dc35153c Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.681541 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.681880 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.18186954 +0000 UTC m=+157.071500137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.782556 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.782652 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.282628815 +0000 UTC m=+157.172259412 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.783115 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.783364 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.283352326 +0000 UTC m=+157.172982923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.788871 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rgdm8"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.824746 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j4nkp"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.864909 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kjxqs" podStartSLOduration=133.864883094 podStartE2EDuration="2m13.864883094s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:44.854033051 +0000 UTC m=+156.743663638" watchObservedRunningTime="2025-10-01 13:39:44.864883094 +0000 UTC m=+156.754513691" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.893551 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.893763 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.39374978 +0000 UTC m=+157.283380377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.908512 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"] Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.974967 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" event={"ID":"e558dddd-0d3c-46a2-aa57-33227ff3054d","Type":"ContainerStarted","Data":"b5bd22d10bd91afb5d3a2798c1356a7da41237a6ba0b398f3cdbc33295776cd0"} Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.977348 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" event={"ID":"8b019ae2-a243-4cd5-bc3b-b0428c74df07","Type":"ContainerStarted","Data":"fb6c2c79521ec01d932cfad05de8c844be889d8b57d373d7808bcd8d895f59a3"} Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.977981 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.979166 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-zmlhc" event={"ID":"e9709232-51d8-4109-8153-43c567908267","Type":"ContainerStarted","Data":"72801e46f8a044422e1d3ecc37d49fd7d25b64cad80637ef02eb8a21dc35153c"} Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.993186 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7crv" podStartSLOduration=133.993169678 podStartE2EDuration="2m13.993169678s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:44.944267322 +0000 UTC m=+156.833897919" watchObservedRunningTime="2025-10-01 13:39:44.993169678 +0000 UTC m=+156.882800275" Oct 01 13:39:44 crc kubenswrapper[4774]: I1001 13:39:44.994528 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:44 crc kubenswrapper[4774]: E1001 13:39:44.994909 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.494897097 +0000 UTC m=+157.384527694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.019817 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" event={"ID":"3ff0b1ac-8f04-4329-a5c7-cef871a84890","Type":"ContainerStarted","Data":"8e2dbcee5ab7c671a29209c8953c70c348abb2c8376323a69de5e31ac681ee3e"} Oct 01 13:39:45 crc kubenswrapper[4774]: W1001 13:39:45.028398 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f2d4ce_a586_48bb_b821_71d8a9a988f8.slice/crio-9b76e198836f33d10f236b39fd38a7d3222e454298cda0741d9f84fef24900a8 WatchSource:0}: Error finding container 9b76e198836f33d10f236b39fd38a7d3222e454298cda0741d9f84fef24900a8: Status 404 returned error can't find the container with id 9b76e198836f33d10f236b39fd38a7d3222e454298cda0741d9f84fef24900a8 Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.045871 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jr7t5" event={"ID":"9fcafec0-e631-4583-b508-14b1bc9be3b6","Type":"ContainerStarted","Data":"6c6f06c4488a978cd03ebd54d03d8f4add1e275a0a0d251279dc1f90f120fce7"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.046808 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9f8ft" podStartSLOduration=134.046798767 podStartE2EDuration="2m14.046798767s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:44.995475523 +0000 UTC m=+156.885106110" watchObservedRunningTime="2025-10-01 13:39:45.046798767 +0000 UTC m=+156.936429354" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.061382 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j"] Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.085342 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" event={"ID":"08c4a4cd-1564-42d3-a19f-0ef17b65d5be","Type":"ContainerStarted","Data":"8a407c40d83b1d8540ae1fabf5fd9de36f8628538200349362030e5715fe4be2"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.099707 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.100780 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.600701163 +0000 UTC m=+157.490331790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.130686 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kgj6n" podStartSLOduration=134.13066094 podStartE2EDuration="2m14.13066094s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.12061638 +0000 UTC m=+157.010246977" watchObservedRunningTime="2025-10-01 13:39:45.13066094 +0000 UTC m=+157.020291547" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.146922 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8gh62" event={"ID":"ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823","Type":"ContainerStarted","Data":"03d0680c82310c752ed38f6f5dd3e7a58eaf3c6d5846a5f96820c9286c9ec127"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.156306 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" event={"ID":"2539b1bd-ceb0-4918-a20c-56775bc1bb17","Type":"ContainerStarted","Data":"40c046b53fd94e8fc76757365d11a8d10483f49436eb0206ff4f5e7e93622d29"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.181427 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" podStartSLOduration=134.181409108 podStartE2EDuration="2m14.181409108s" 
podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.172075418 +0000 UTC m=+157.061706015" watchObservedRunningTime="2025-10-01 13:39:45.181409108 +0000 UTC m=+157.071039705" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.188312 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w"] Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.189274 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" event={"ID":"d7b8192d-4515-4ac3-a253-b245bb57c64e","Type":"ContainerStarted","Data":"bb9104bcaf6aac2db91c2d5080e5912c26162de4abab54d4082315508457d0b5"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.189310 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" event={"ID":"d7b8192d-4515-4ac3-a253-b245bb57c64e","Type":"ContainerStarted","Data":"9443e9afc1f352d3dc877bfc4f4e9d942e2a99f8ac353767d1cac06a333b135d"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.190618 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.197549 4774 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-9ljx9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.197601 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" 
podUID="d7b8192d-4515-4ac3-a253-b245bb57c64e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.202616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.204468 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.704438492 +0000 UTC m=+157.594069089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.244946 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" event={"ID":"1ae865cc-0785-4017-9e04-be7d244b0493","Type":"ContainerStarted","Data":"5fc19879e5b11f8bcfed0209a8a40d74f1722b462b46ec39311dd26e46154a33"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.251432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" event={"ID":"c6ea90f2-b0dc-4809-a02d-c44eda1431c2","Type":"ContainerStarted","Data":"65a88b93843ae9dc09a16becb67e5a77f6ac52dd866d6d6af0967f5fc58f093f"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.267641 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl"] Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.267740 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" podStartSLOduration=134.26772991 podStartE2EDuration="2m14.26772991s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.255855159 +0000 UTC m=+157.145485756" watchObservedRunningTime="2025-10-01 13:39:45.26772991 +0000 UTC m=+157.157360507" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.276124 4774 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" event={"ID":"7539887f-d1c5-417e-aaf3-669de74c241d","Type":"ContainerStarted","Data":"c1bb4310a9e92552cf795956c5bf4f6adf9eeea35fe4042274dfd29016a09ccc"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.276362 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" event={"ID":"7539887f-d1c5-417e-aaf3-669de74c241d","Type":"ContainerStarted","Data":"6cc9b2069ae3629b31251a08ba970cec99be7c3cbfc0c4a25a602c0bcd7853f9"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.305009 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.306894 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.806859734 +0000 UTC m=+157.696490331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.314957 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" event={"ID":"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f","Type":"ContainerStarted","Data":"e8dd1dd78c6cec15fa35cd4397098f694747b372d6c1d2a97ce73c9c91b8de45"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.314993 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" event={"ID":"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f","Type":"ContainerStarted","Data":"e42f99d373d7b046cc721a295476edad5d64a3ec87ec5fb0cea1974a119ba912"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.321017 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" event={"ID":"ce42be67-2e4d-4ca7-8ed8-5173d003c548","Type":"ContainerStarted","Data":"37adff8b29a3233938c5b12df97b1c769f5ea5e130307fe8e502a19b32df1510"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.377865 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.405703 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" 
event={"ID":"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6","Type":"ContainerStarted","Data":"99790d2faf60f93a9d2fcd7bd8c63a8c2bce875a95f85b7917c44a09c46a0bab"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.406952 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.407508 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.407753 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:45.907743193 +0000 UTC m=+157.797373790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.408480 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kxbs7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.408596 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.411942 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:45 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:45 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:45 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.412229 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.424769 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" event={"ID":"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e","Type":"ContainerStarted","Data":"2203e9859c77c9adb2823a22f914c260ddeb75e84eb96f692daf4b9bea5859ed"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.431762 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" event={"ID":"36ad3550-2755-4d27-8cfb-11b0c82f1bb0","Type":"ContainerStarted","Data":"321dee891b5723421a9c67148ad06c90db31fb34c41e7a74e18ee88e0ea178bd"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.442152 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" podStartSLOduration=134.442135274 podStartE2EDuration="2m14.442135274s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.440399185 +0000 UTC m=+157.330029782" watchObservedRunningTime="2025-10-01 13:39:45.442135274 +0000 UTC m=+157.331765861" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.450005 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" event={"ID":"f03e45d8-27cd-4b18-87cf-e5abce7af1e2","Type":"ContainerStarted","Data":"90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.450996 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.452765 4774 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h"] Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.456973 4774 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-chw44 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.458869 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" podUID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.458891 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r7bcf" event={"ID":"16564c2c-07e2-4d6e-9f35-fd14654d1538","Type":"ContainerStarted","Data":"089a3aedf89a7448a44b0e4a787428e2ba15058e37e737693624dc1dad419113"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.464111 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" event={"ID":"ed0e4912-6802-4b79-b7ed-aec44a875cfb","Type":"ContainerStarted","Data":"7b13074192bfaed81eb30a0ae9d355c0a5cc2a52ee6fb9f9d4699f0e3a9ab2a9"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.472515 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" event={"ID":"00147e48-69ae-44af-8330-8cdf7618a470","Type":"ContainerStarted","Data":"2e504d5a18e47818abeb7abdab45541025b5f6cba3022e3ff43bfff711d3f531"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.485036 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" event={"ID":"b719c2e8-d04c-4b7e-998c-643f5b166d13","Type":"ContainerStarted","Data":"e7214e28f8e4f5d581eabea83795ffa6a9b7f5f54356303d40c21508530971ae"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.485374 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" event={"ID":"b719c2e8-d04c-4b7e-998c-643f5b166d13","Type":"ContainerStarted","Data":"733edd74275d1bf3e18c9dd7269bd372aac1cab348e2f803431df4ed3210d5e6"} Oct 01 13:39:45 crc kubenswrapper[4774]: W1001 13:39:45.489650 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea9fe5ef_1a76_4ef3_83d9_d6169a4cc868.slice/crio-87377ca1f81792ccbf04d559b1f3ecd50e739102e94f3708d333d8e2fbd04023 WatchSource:0}: Error finding container 87377ca1f81792ccbf04d559b1f3ecd50e739102e94f3708d333d8e2fbd04023: Status 404 returned error can't find the container with id 87377ca1f81792ccbf04d559b1f3ecd50e739102e94f3708d333d8e2fbd04023 Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.490971 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xsqjd"] Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.494772 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" event={"ID":"f7313744-98b0-4a56-bd19-280760be765f","Type":"ContainerStarted","Data":"e5e7395a44cb1ccadc1b3ac0570e5b63dea72d7557d10be247bd0e0b6d6f3533"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.495100 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kk6dj"] Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.511148 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.511524 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.011508882 +0000 UTC m=+157.901139479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.513877 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.512795 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-kjwcn" podStartSLOduration=134.512786168 podStartE2EDuration="2m14.512786168s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
13:39:45.512589943 +0000 UTC m=+157.402220540" watchObservedRunningTime="2025-10-01 13:39:45.512786168 +0000 UTC m=+157.402416765" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.511864 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" event={"ID":"c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3","Type":"ContainerStarted","Data":"92c24efcff84f96979c9a89ba0b93ce158e4a010c2f46e7c614437f7807d89cf"} Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.518955 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.01893384 +0000 UTC m=+157.908564527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.525906 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lsczd" event={"ID":"6737d9a0-288f-44bf-a67a-0a7cb37e89f9","Type":"ContainerStarted","Data":"35fd369f1c10d34f87510a2a33e902a0be1199f3927efb7d43cb5d283f090bc6"} Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.529502 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-9f8ft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 
13:39:45.529548 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9f8ft" podUID="1c4215dd-b2d9-4617-9bc4-43536f0a06f6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.538116 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.548745 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zwfjn" podStartSLOduration=134.548724332 podStartE2EDuration="2m14.548724332s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.548119775 +0000 UTC m=+157.437750392" watchObservedRunningTime="2025-10-01 13:39:45.548724332 +0000 UTC m=+157.438354949" Oct 01 13:39:45 crc kubenswrapper[4774]: W1001 13:39:45.580136 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb69e1571_8ffe_4425_917c_bb7021c3c74b.slice/crio-6a2a8ec3757c1c464d6f4922b6fa7253f50981f427aa1603b63bf9098a443cde WatchSource:0}: Error finding container 6a2a8ec3757c1c464d6f4922b6fa7253f50981f427aa1603b63bf9098a443cde: Status 404 returned error can't find the container with id 6a2a8ec3757c1c464d6f4922b6fa7253f50981f427aa1603b63bf9098a443cde Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.603581 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" podStartSLOduration=135.603568275 podStartE2EDuration="2m15.603568275s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.602773853 +0000 UTC m=+157.492404440" watchObservedRunningTime="2025-10-01 13:39:45.603568275 +0000 UTC m=+157.493198872" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.610894 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4fcb5" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.614532 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.614795 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.114768608 +0000 UTC m=+158.004399205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.615207 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.616131 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.116119036 +0000 UTC m=+158.005749623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.639968 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" podStartSLOduration=134.639953852 podStartE2EDuration="2m14.639953852s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.638661175 +0000 UTC m=+157.528291772" watchObservedRunningTime="2025-10-01 13:39:45.639953852 +0000 UTC m=+157.529584449" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.716987 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.717507 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.217433297 +0000 UTC m=+158.107063894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.764967 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6z77g" podStartSLOduration=135.764949844 podStartE2EDuration="2m15.764949844s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.695772841 +0000 UTC m=+157.585403438" watchObservedRunningTime="2025-10-01 13:39:45.764949844 +0000 UTC m=+157.654580441" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.820184 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.820522 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.320508097 +0000 UTC m=+158.210138694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.828044 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8gh62" podStartSLOduration=134.828028537 podStartE2EDuration="2m14.828028537s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.766664032 +0000 UTC m=+157.656294629" watchObservedRunningTime="2025-10-01 13:39:45.828028537 +0000 UTC m=+157.717659124" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.829641 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" podStartSLOduration=135.829632112 podStartE2EDuration="2m15.829632112s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.825853906 +0000 UTC m=+157.715484503" watchObservedRunningTime="2025-10-01 13:39:45.829632112 +0000 UTC m=+157.719262709" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.928576 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.928722 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.42869882 +0000 UTC m=+158.318329417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.928852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:45 crc kubenswrapper[4774]: E1001 13:39:45.929254 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.429241025 +0000 UTC m=+158.318871622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.939846 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzdb5" podStartSLOduration=134.939827291 podStartE2EDuration="2m14.939827291s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.897141788 +0000 UTC m=+157.786772385" watchObservedRunningTime="2025-10-01 13:39:45.939827291 +0000 UTC m=+157.829457888" Oct 01 13:39:45 crc kubenswrapper[4774]: I1001 13:39:45.942115 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" podStartSLOduration=134.942104305 podStartE2EDuration="2m14.942104305s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.941078976 +0000 UTC m=+157.830709573" watchObservedRunningTime="2025-10-01 13:39:45.942104305 +0000 UTC m=+157.831734902" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.003187 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6dcpv" podStartSLOduration=135.003169061 podStartE2EDuration="2m15.003169061s" 
podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:45.999592841 +0000 UTC m=+157.889223438" watchObservedRunningTime="2025-10-01 13:39:46.003169061 +0000 UTC m=+157.892799658" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.030774 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.031018 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.531003629 +0000 UTC m=+158.420634226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.098079 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" podStartSLOduration=135.098060932 podStartE2EDuration="2m15.098060932s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.064183036 +0000 UTC m=+157.953813643" watchObservedRunningTime="2025-10-01 13:39:46.098060932 +0000 UTC m=+157.987691529" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.133113 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.133428 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.63341628 +0000 UTC m=+158.523046877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.181111 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jzv9l" podStartSLOduration=136.181093393 podStartE2EDuration="2m16.181093393s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.180904847 +0000 UTC m=+158.070535454" watchObservedRunningTime="2025-10-01 13:39:46.181093393 +0000 UTC m=+158.070723980" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.182736 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lsczd" podStartSLOduration=6.182729728 podStartE2EDuration="6.182729728s" podCreationTimestamp="2025-10-01 13:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.097730333 +0000 UTC m=+157.987360930" watchObservedRunningTime="2025-10-01 13:39:46.182729728 +0000 UTC m=+158.072360315" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.244084 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.244181 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.744157685 +0000 UTC m=+158.633788282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.244283 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.244618 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.744604377 +0000 UTC m=+158.634234974 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.344748 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.353147 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:46.853106359 +0000 UTC m=+158.742737006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.387088 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:46 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:46 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:46 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.387129 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.447929 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.448206 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 13:39:46.948194676 +0000 UTC m=+158.837825263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.546697 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" event={"ID":"1ae865cc-0785-4017-9e04-be7d244b0493","Type":"ContainerStarted","Data":"46301723b89eff88c7ac432ee3547d8ef7e09a2dcc3e671ef0fd3e46ab4489f8"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.552508 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.552714 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.052700176 +0000 UTC m=+158.942330773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.557943 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" event={"ID":"e6f2d4ce-a586-48bb-b821-71d8a9a988f8","Type":"ContainerStarted","Data":"3354288fa35a9773464f7399acee45b3a9513b6dd5f49f515624373bba853ab6"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.557985 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" event={"ID":"e6f2d4ce-a586-48bb-b821-71d8a9a988f8","Type":"ContainerStarted","Data":"9b76e198836f33d10f236b39fd38a7d3222e454298cda0741d9f84fef24900a8"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.589586 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" event={"ID":"f6078b99-d5d2-48ce-89c7-163eca80ff85","Type":"ContainerStarted","Data":"9ba8f47f11846ec45d2bb284e40f557b86f302181d004943e07418a3382ce1b6"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.590738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" event={"ID":"00147e48-69ae-44af-8330-8cdf7618a470","Type":"ContainerStarted","Data":"6d01560deb6cb145de986cdfdc1314b14a9268cf645a588905407073afb39922"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.590763 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" event={"ID":"00147e48-69ae-44af-8330-8cdf7618a470","Type":"ContainerStarted","Data":"8cd2871f6368a612d0816f71da8ba98dfe9d11c4a8681d2179d1b5555974c280"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.592209 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" event={"ID":"3ff0b1ac-8f04-4329-a5c7-cef871a84890","Type":"ContainerStarted","Data":"48b3905ad4a8640557f2b80da3002b5dca4873969a908a85fb38065287bb0d8a"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.594006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" event={"ID":"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868","Type":"ContainerStarted","Data":"2f6bc6337da15f6114c60cff78d483169781e53661160bc6929b6e862f35cddc"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.594031 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" event={"ID":"ea9fe5ef-1a76-4ef3-83d9-d6169a4cc868","Type":"ContainerStarted","Data":"87377ca1f81792ccbf04d559b1f3ecd50e739102e94f3708d333d8e2fbd04023"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.595081 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" event={"ID":"e558dddd-0d3c-46a2-aa57-33227ff3054d","Type":"ContainerStarted","Data":"98bd24f853c18172249fe5e70e20d15c05879b2567cd908c2bf48f0c26d5789b"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.613614 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" event={"ID":"08c4a4cd-1564-42d3-a19f-0ef17b65d5be","Type":"ContainerStarted","Data":"1543ebecc87e913096e9d9b23fd8b1cfe4884b881dc17add5af4de8a29f5eeca"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.613672 4774 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" event={"ID":"08c4a4cd-1564-42d3-a19f-0ef17b65d5be","Type":"ContainerStarted","Data":"68a93448b7265d8cf621bb85067c9ffe1334caa48e9e7202ac932404adb2e0de"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.616317 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kk6dj" event={"ID":"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010","Type":"ContainerStarted","Data":"511ff6ea6081a65156ddb711b7f3b41bf5aabe8a2248aa28b0ade55ef194668e"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.617157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" event={"ID":"63a3d8f9-c505-4185-86fb-31eaf6c4cd72","Type":"ContainerStarted","Data":"bc67dbde75ed47e42bffc361ffb95ca2d692dd164e36d35f02fb7638218940a9"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.617185 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" event={"ID":"63a3d8f9-c505-4185-86fb-31eaf6c4cd72","Type":"ContainerStarted","Data":"e18aec8a83cc22a036b44763e06bb11f999d9af0a72463e244c2904f413c9f8b"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.625995 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" event={"ID":"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e","Type":"ContainerStarted","Data":"d0c2f9ecea21829750d85ec5244c5608d1f00dd06c8dbe26ca1b0f557eb92ecd"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.627255 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g4z9v" podStartSLOduration=135.627243089 podStartE2EDuration="2m15.627243089s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.589362471 +0000 UTC m=+158.478993068" watchObservedRunningTime="2025-10-01 13:39:46.627243089 +0000 UTC m=+158.516873706" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.627952 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" podStartSLOduration=135.627946459 podStartE2EDuration="2m15.627946459s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.625784609 +0000 UTC m=+158.515415206" watchObservedRunningTime="2025-10-01 13:39:46.627946459 +0000 UTC m=+158.517577046" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.646048 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" event={"ID":"f7313744-98b0-4a56-bd19-280760be765f","Type":"ContainerStarted","Data":"cfa8432510c8163eff022be09304ca9b86914d2597428d04574bace909c3f141"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.655125 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.656834 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.156816936 +0000 UTC m=+159.046447533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.657740 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" event={"ID":"b69e1571-8ffe-4425-917c-bb7021c3c74b","Type":"ContainerStarted","Data":"6a2a8ec3757c1c464d6f4922b6fa7253f50981f427aa1603b63bf9098a443cde"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.671565 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" event={"ID":"2aeefd1c-f8aa-483d-bf3e-424600e9557e","Type":"ContainerStarted","Data":"d06527be91644052b2db849efb1809ef01498bd8eef74790da0f65a802033fb9"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.671612 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" event={"ID":"2aeefd1c-f8aa-483d-bf3e-424600e9557e","Type":"ContainerStarted","Data":"6d5f87e5400ce95503778d892151305c2f6a83bf4c8c75bde8ff0c2eac9c8d28"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.672274 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.673522 4774 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vtvsl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 
10.217.0.24:8443: connect: connection refused" start-of-body= Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.673585 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" podUID="2aeefd1c-f8aa-483d-bf3e-424600e9557e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.674352 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ddtd2" podStartSLOduration=136.674337775 podStartE2EDuration="2m16.674337775s" podCreationTimestamp="2025-10-01 13:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.667145744 +0000 UTC m=+158.556776351" watchObservedRunningTime="2025-10-01 13:39:46.674337775 +0000 UTC m=+158.563968372" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.676348 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jr7t5" event={"ID":"9fcafec0-e631-4583-b508-14b1bc9be3b6","Type":"ContainerStarted","Data":"798d659fa265d02fc656184ee5c2a65d71161d9fade070bd6389e7d992842e36"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.687944 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" event={"ID":"e9709232-51d8-4109-8153-43c567908267","Type":"ContainerStarted","Data":"b5a56da377a15c09de4e4fe5b7eaa300be428feeec6a164d4fe83372689daa29"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.689588 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.696654 4774 
patch_prober.go:28] interesting pod/console-operator-58897d9998-zmlhc container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.696714 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" podUID="e9709232-51d8-4109-8153-43c567908267" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/readyz\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.697425 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-sfqhx" podStartSLOduration=135.69741026 podStartE2EDuration="2m15.69741026s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.695046894 +0000 UTC m=+158.584677491" watchObservedRunningTime="2025-10-01 13:39:46.69741026 +0000 UTC m=+158.587040857" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.712373 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r7bcf" event={"ID":"16564c2c-07e2-4d6e-9f35-fd14654d1538","Type":"ContainerStarted","Data":"9fbb63aab0e29c353b7714c5d2729b90372190f98d222c8349c8c8064ad69b21"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.759024 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:46 crc 
kubenswrapper[4774]: E1001 13:39:46.759419 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.259394002 +0000 UTC m=+159.149024599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.761469 4774 generic.go:334] "Generic (PLEG): container finished" podID="a0c0ab2f-639f-4892-80a8-d0ed090e6d5f" containerID="e8dd1dd78c6cec15fa35cd4397098f694747b372d6c1d2a97ce73c9c91b8de45" exitCode=0 Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.761736 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" event={"ID":"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f","Type":"ContainerDied","Data":"e8dd1dd78c6cec15fa35cd4397098f694747b372d6c1d2a97ce73c9c91b8de45"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.761792 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" event={"ID":"a0c0ab2f-639f-4892-80a8-d0ed090e6d5f","Type":"ContainerStarted","Data":"c4f70d88fa7397ad9174990ffb55dd9d43a0cee9de859d9b3b568783c23e5170"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.761818 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:46 crc 
kubenswrapper[4774]: I1001 13:39:46.770130 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7j9r" podStartSLOduration=135.770111522 podStartE2EDuration="2m15.770111522s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.72316471 +0000 UTC m=+158.612795307" watchObservedRunningTime="2025-10-01 13:39:46.770111522 +0000 UTC m=+158.659742119" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.779068 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" event={"ID":"b719c2e8-d04c-4b7e-998c-643f5b166d13","Type":"ContainerStarted","Data":"fec5048f89874a400dc7ddf636b3328acf43cc85d913d1ffb3ca3aa5f6a5813b"} Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.786977 4774 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kxbs7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.787023 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.793237 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-9f8ft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" 
start-of-body= Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.793287 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9f8ft" podUID="1c4215dd-b2d9-4617-9bc4-43536f0a06f6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.827144 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5w5f" podStartSLOduration=135.827129685 podStartE2EDuration="2m15.827129685s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.825471518 +0000 UTC m=+158.715102125" watchObservedRunningTime="2025-10-01 13:39:46.827129685 +0000 UTC m=+158.716760272" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.830063 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" podStartSLOduration=135.830051286 podStartE2EDuration="2m15.830051286s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.771671155 +0000 UTC m=+158.661301752" watchObservedRunningTime="2025-10-01 13:39:46.830051286 +0000 UTC m=+158.719681873" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.862525 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:46 crc kubenswrapper[4774]: E1001 13:39:46.862780 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.362764761 +0000 UTC m=+159.252395358 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.873665 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-9ljx9" Oct 01 13:39:46 crc kubenswrapper[4774]: I1001 13:39:46.966646 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xsqjd" podStartSLOduration=135.966629673 podStartE2EDuration="2m15.966629673s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.873068388 +0000 UTC m=+158.762698985" watchObservedRunningTime="2025-10-01 13:39:46.966629673 +0000 UTC m=+158.856260270" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.005441 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" podStartSLOduration=136.005425287 podStartE2EDuration="2m16.005425287s" podCreationTimestamp="2025-10-01 13:37:31 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:46.95222488 +0000 UTC m=+158.841855487" watchObservedRunningTime="2025-10-01 13:39:47.005425287 +0000 UTC m=+158.895055884" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.006546 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" podStartSLOduration=136.006542838 podStartE2EDuration="2m16.006542838s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.004201543 +0000 UTC m=+158.893832140" watchObservedRunningTime="2025-10-01 13:39:47.006542838 +0000 UTC m=+158.896173435" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.008864 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.009130 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.50911828 +0000 UTC m=+159.398748877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.037567 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-r7bcf" podStartSLOduration=136.037551865 podStartE2EDuration="2m16.037551865s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.037051081 +0000 UTC m=+158.926681678" watchObservedRunningTime="2025-10-01 13:39:47.037551865 +0000 UTC m=+158.927182452" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.113336 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.113710 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.613696572 +0000 UTC m=+159.503327169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.174933 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" podStartSLOduration=136.174920263 podStartE2EDuration="2m16.174920263s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.171859837 +0000 UTC m=+159.061490444" watchObservedRunningTime="2025-10-01 13:39:47.174920263 +0000 UTC m=+159.064550860" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.175024 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jr7t5" podStartSLOduration=7.175021466 podStartE2EDuration="7.175021466s" podCreationTimestamp="2025-10-01 13:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.098327623 +0000 UTC m=+158.987958240" watchObservedRunningTime="2025-10-01 13:39:47.175021466 +0000 UTC m=+159.064652053" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.197535 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.197891 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.213828 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hcsr7" podStartSLOduration=136.21381469 podStartE2EDuration="2m16.21381469s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.211994619 +0000 UTC m=+159.101625216" watchObservedRunningTime="2025-10-01 13:39:47.21381469 +0000 UTC m=+159.103445287" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.216944 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.217315 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.717299837 +0000 UTC m=+159.606930424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.226567 4774 patch_prober.go:28] interesting pod/apiserver-76f77b778f-nndfg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]log ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]etcd ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/max-in-flight-filter ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 01 13:39:47 crc kubenswrapper[4774]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/project.openshift.io-projectcache ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/openshift.io-startinformers ok Oct 01 13:39:47 crc kubenswrapper[4774]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 01 13:39:47 crc kubenswrapper[4774]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 13:39:47 crc kubenswrapper[4774]: livez check failed Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.226634 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" podUID="c1ff7b1d-7ac6-4fa8-8003-b2f68a12b2b3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.273254 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hz99" podStartSLOduration=136.2732239 podStartE2EDuration="2m16.2732239s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.271057599 +0000 UTC m=+159.160688206" watchObservedRunningTime="2025-10-01 13:39:47.2732239 +0000 UTC m=+159.162854497" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.317843 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.318105 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.818092654 +0000 UTC m=+159.707723251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.379862 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:47 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:47 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:47 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.379930 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.418752 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.418995 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 13:39:47.918964932 +0000 UTC m=+159.808595529 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.419242 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.419573 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:47.919560679 +0000 UTC m=+159.809191276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.520943 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.521129 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.021103806 +0000 UTC m=+159.910734403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.521208 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.521549 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.021541299 +0000 UTC m=+159.911171896 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.596322 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.622255 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.622478 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.122432488 +0000 UTC m=+160.012063085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.622512 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.622873 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.122844329 +0000 UTC m=+160.012474926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.728579 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.728843 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.228827821 +0000 UTC m=+160.118458418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.785702 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kk6dj" event={"ID":"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010","Type":"ContainerStarted","Data":"6673d4a30cab6a4261ea1baaa6ace9047aa1c635d8cd47c7a1b8d1266d45615a"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.785749 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kk6dj" event={"ID":"03c4eff0-6afe-44eb-a9cd-1e2ab9ef2010","Type":"ContainerStarted","Data":"d12625f5e57e00c5472303e78c2cfb3d769deb7e09459342dfe006b9ca6a8caa"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.785818 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-kk6dj" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.787817 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" event={"ID":"63a3d8f9-c505-4185-86fb-31eaf6c4cd72","Type":"ContainerStarted","Data":"1a2665e57941305734d839188f89714c6c8ec9a7f9b234e257361ae5a3e8b06f"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.790345 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" event={"ID":"cd5a9b77-d066-401b-a7c2-0c331cc8dd2e","Type":"ContainerStarted","Data":"7ee050089e00c807ffae3469dd332cb813e5ca5d5c8c866ebe570300c04b5cb8"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 
13:39:47.796898 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" event={"ID":"2539b1bd-ceb0-4918-a20c-56775bc1bb17","Type":"ContainerStarted","Data":"5d38c4a0aa2f9e46d8ce9179fe56ee86f5a84da6458d668c3fd4d6e19a33e066"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.796954 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" event={"ID":"2539b1bd-ceb0-4918-a20c-56775bc1bb17","Type":"ContainerStarted","Data":"ed320858979497550f3794991cd4b63bf01a3841fd7fb8fc1c662f5959965c45"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.798958 4774 generic.go:334] "Generic (PLEG): container finished" podID="b69e1571-8ffe-4425-917c-bb7021c3c74b" containerID="77542286cc732c1e04f2b238caac5445f3f62707fb9a5f4e7f24948fb2edc342" exitCode=0 Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.799028 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" event={"ID":"b69e1571-8ffe-4425-917c-bb7021c3c74b","Type":"ContainerDied","Data":"77542286cc732c1e04f2b238caac5445f3f62707fb9a5f4e7f24948fb2edc342"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.804413 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" event={"ID":"c6ea90f2-b0dc-4809-a02d-c44eda1431c2","Type":"ContainerStarted","Data":"e1a2eb55b717a5fec21a833ec48c1c5eb8995acbcd6a2481c728ce0712e039c9"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.809025 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vr5w" event={"ID":"f6078b99-d5d2-48ce-89c7-163eca80ff85","Type":"ContainerStarted","Data":"cf7c99fc584a399b7cd4bea52782cceb8adecfe888633e48935d4373b5a008ca"} Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.809796 4774 patch_prober.go:28] interesting 
pod/catalog-operator-68c6474976-vtvsl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.809845 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" podUID="2aeefd1c-f8aa-483d-bf3e-424600e9557e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.819079 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.829619 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.829950 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.329939276 +0000 UTC m=+160.219569873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.836883 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kk6dj" podStartSLOduration=7.83686674 podStartE2EDuration="7.83686674s" podCreationTimestamp="2025-10-01 13:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.831633694 +0000 UTC m=+159.721264311" watchObservedRunningTime="2025-10-01 13:39:47.83686674 +0000 UTC m=+159.726497337" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.931795 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.932005 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.431967446 +0000 UTC m=+160.321598043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.932815 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:47 crc kubenswrapper[4774]: E1001 13:39:47.935955 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.435938217 +0000 UTC m=+160.325568814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.959366 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-j4nkp" podStartSLOduration=136.959350861 podStartE2EDuration="2m16.959350861s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.958259061 +0000 UTC m=+159.847889648" watchObservedRunningTime="2025-10-01 13:39:47.959350861 +0000 UTC m=+159.848981458" Oct 01 13:39:47 crc kubenswrapper[4774]: I1001 13:39:47.960662 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ldzzw" podStartSLOduration=136.960653638 podStartE2EDuration="2m16.960653638s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:47.902257266 +0000 UTC m=+159.791887863" watchObservedRunningTime="2025-10-01 13:39:47.960653638 +0000 UTC m=+159.850284235" Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.035552 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.036179 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.536162948 +0000 UTC m=+160.425793535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.045281 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gs6j" podStartSLOduration=137.045267082 podStartE2EDuration="2m17.045267082s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:48.044987704 +0000 UTC m=+159.934618301" watchObservedRunningTime="2025-10-01 13:39:48.045267082 +0000 UTC m=+159.934897679" Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.105504 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zmlhc" Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.144970 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.145287 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.645273807 +0000 UTC m=+160.534904424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.246530 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.246812 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.746795933 +0000 UTC m=+160.636426530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.348345 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.348716 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:48.848701081 +0000 UTC m=+160.738331678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.378242 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:48 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:48 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:48 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.378297 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.449994 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.450283 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 13:39:48.950268309 +0000 UTC m=+160.839898906 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.551547 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.551823 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.051812707 +0000 UTC m=+160.941443294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.653102 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.653434 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.153420596 +0000 UTC m=+161.043051193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.758127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.758394 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.258382859 +0000 UTC m=+161.148013456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.815734 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" event={"ID":"b69e1571-8ffe-4425-917c-bb7021c3c74b","Type":"ContainerStarted","Data":"d8cb0ac27e4be769f979a2a635959d7ced8c386634ae3b473e4ec8ae1125c13c"} Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.818693 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" event={"ID":"c6ea90f2-b0dc-4809-a02d-c44eda1431c2","Type":"ContainerStarted","Data":"a6e9f222da5d3a116f0d846612f23f235ea5dd1af7dc9692b3852013140edf5a"} Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.824206 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vtvsl" Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.839221 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" podStartSLOduration=137.839207107 podStartE2EDuration="2m17.839207107s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:48.837543151 +0000 UTC m=+160.727173758" watchObservedRunningTime="2025-10-01 13:39:48.839207107 +0000 UTC m=+160.728837704" Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.859614 4774 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.859734 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.35971075 +0000 UTC m=+161.249341347 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.860553 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.860976 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.360964235 +0000 UTC m=+161.250594832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.962132 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.962400 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.462375049 +0000 UTC m=+161.352005636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:48 crc kubenswrapper[4774]: I1001 13:39:48.962744 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:48 crc kubenswrapper[4774]: E1001 13:39:48.964609 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.464597531 +0000 UTC m=+161.354228208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.064167 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.064798 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.56478375 +0000 UTC m=+161.454414347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.156863 4774 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.166087 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.166378 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.666366859 +0000 UTC m=+161.555997446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.266925 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.267178 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.767142605 +0000 UTC m=+161.656773222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.267385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.267749 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.767738092 +0000 UTC m=+161.657368699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.328392 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-xfhs8" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.368341 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.368644 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.868628771 +0000 UTC m=+161.758259368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.372685 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.372875 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.385491 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:49 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:49 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:49 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.385752 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.469706 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.470529 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:49.970515918 +0000 UTC m=+161.860146515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.571959 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.572436 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:50.072420875 +0000 UTC m=+161.962051472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.673988 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.674332 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:50.174321153 +0000 UTC m=+162.063951740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.715571 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rjxls"] Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.716579 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.722045 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.728206 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjxls"] Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.774641 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.774767 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:50.274753029 +0000 UTC m=+162.164383616 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.774830 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.774882 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-utilities\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.774907 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d4rm\" (UniqueName: \"kubernetes.io/projected/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-kube-api-access-2d4rm\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.774923 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-catalog-content\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.775184 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:50.275176151 +0000 UTC m=+162.164806748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.816953 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.825351 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" event={"ID":"c6ea90f2-b0dc-4809-a02d-c44eda1431c2","Type":"ContainerStarted","Data":"905108396a0d634ce75670f5e418831ae3cceddd6e2557d8cb7fe3667f52cab6"} Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.825388 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" event={"ID":"c6ea90f2-b0dc-4809-a02d-c44eda1431c2","Type":"ContainerStarted","Data":"0cdc1c41baaa7211c04b0c2013aa3098cc8e22bfee9571859a9f2ccf559db094"} Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.834486 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnt9h" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.853934 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rgdm8" podStartSLOduration=9.853918131 podStartE2EDuration="9.853918131s" podCreationTimestamp="2025-10-01 13:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:49.852348727 +0000 UTC m=+161.741979334" watchObservedRunningTime="2025-10-01 13:39:49.853918131 +0000 UTC m=+161.743548728" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.875798 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.876146 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d4rm\" (UniqueName: \"kubernetes.io/projected/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-kube-api-access-2d4rm\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.876169 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-catalog-content\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.876249 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-utilities\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.876915 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:39:50.376892773 +0000 UTC m=+162.266523370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.876962 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-utilities\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.877296 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-catalog-content\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.899235 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2d4rm\" (UniqueName: \"kubernetes.io/projected/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-kube-api-access-2d4rm\") pod \"certified-operators-rjxls\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") " pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.917116 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7fkzd"] Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.917988 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.919695 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.937390 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fkzd"] Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.977664 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ddcf\" (UniqueName: \"kubernetes.io/projected/a572962b-dbdb-4313-ada5-20c8eddfed11-kube-api-access-5ddcf\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.978112 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-catalog-content\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.978240 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-utilities\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:49 crc kubenswrapper[4774]: I1001 13:39:49.978378 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:49 crc kubenswrapper[4774]: E1001 13:39:49.984681 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:39:50.484666765 +0000 UTC m=+162.374297352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xcl4x" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.035961 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.073711 4774 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T13:39:49.156887304Z","Handler":null,"Name":""} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.077275 4774 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.077361 4774 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.082426 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.089127 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ddcf\" (UniqueName: \"kubernetes.io/projected/a572962b-dbdb-4313-ada5-20c8eddfed11-kube-api-access-5ddcf\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.089314 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-catalog-content\") pod \"community-operators-7fkzd\" (UID: 
\"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.089397 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-utilities\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.089856 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-utilities\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.090068 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-catalog-content\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.116476 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fj4dc"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.117718 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.116490 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ddcf\" (UniqueName: \"kubernetes.io/projected/a572962b-dbdb-4313-ada5-20c8eddfed11-kube-api-access-5ddcf\") pod \"community-operators-7fkzd\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") " pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.129709 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fj4dc"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.138666 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.190289 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-catalog-content\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.190333 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.190356 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-utilities\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.190397 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6cz2\" (UniqueName: \"kubernetes.io/projected/27e9ea5d-233d-4792-ab12-da809240a05c-kube-api-access-m6cz2\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.193933 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.193979 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.230356 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.232576 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xcl4x\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") " pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.278707 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjxls"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.292524 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6cz2\" (UniqueName: \"kubernetes.io/projected/27e9ea5d-233d-4792-ab12-da809240a05c-kube-api-access-m6cz2\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.292619 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-catalog-content\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.292645 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-utilities\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.293220 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-catalog-content\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.296577 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-utilities\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.316234 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwf85"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.317237 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.318790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6cz2\" (UniqueName: \"kubernetes.io/projected/27e9ea5d-233d-4792-ab12-da809240a05c-kube-api-access-m6cz2\") pod \"certified-operators-fj4dc\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.324489 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwf85"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.378603 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:50 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:50 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:50 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.378657 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.394191 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-utilities\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.394252 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-catalog-content\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.394282 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bjp\" (UniqueName: \"kubernetes.io/projected/f0f32cd0-a153-4829-9671-bcf21227cc13-kube-api-access-v5bjp\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.438687 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fkzd"] Oct 01 13:39:50 crc kubenswrapper[4774]: W1001 13:39:50.444780 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda572962b_dbdb_4313_ada5_20c8eddfed11.slice/crio-c53780b4c63b004bd134cdd1db99595dfe6a19e742700a8d1987c20b442e86b1 WatchSource:0}: Error finding container c53780b4c63b004bd134cdd1db99595dfe6a19e742700a8d1987c20b442e86b1: Status 404 returned error can't find the container with id c53780b4c63b004bd134cdd1db99595dfe6a19e742700a8d1987c20b442e86b1 Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.481826 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.495163 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-utilities\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.495201 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-catalog-content\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.495225 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bjp\" (UniqueName: \"kubernetes.io/projected/f0f32cd0-a153-4829-9671-bcf21227cc13-kube-api-access-v5bjp\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.496127 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-utilities\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.496344 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-catalog-content\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " 
pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.524791 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bjp\" (UniqueName: \"kubernetes.io/projected/f0f32cd0-a153-4829-9671-bcf21227cc13-kube-api-access-v5bjp\") pod \"community-operators-zwf85\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.530927 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.608669 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.611741 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.614086 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.614239 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.624013 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.642153 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.697198 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49d829f3-7369-4ca1-a2b4-d4463250496f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.697252 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49d829f3-7369-4ca1-a2b4-d4463250496f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.753498 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xcl4x"] Oct 01 13:39:50 crc kubenswrapper[4774]: W1001 13:39:50.759825 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f9c75f_2ec4_4089_88b1_0bb1ba287f16.slice/crio-fe0783984cc7dcc481abf7afde0858cc14ea1188e82b072bc7244382a4cea8db WatchSource:0}: Error finding container fe0783984cc7dcc481abf7afde0858cc14ea1188e82b072bc7244382a4cea8db: Status 404 returned error can't find the container with id fe0783984cc7dcc481abf7afde0858cc14ea1188e82b072bc7244382a4cea8db Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.798224 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49d829f3-7369-4ca1-a2b4-d4463250496f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.798292 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49d829f3-7369-4ca1-a2b4-d4463250496f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.798618 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49d829f3-7369-4ca1-a2b4-d4463250496f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.815773 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49d829f3-7369-4ca1-a2b4-d4463250496f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.831747 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwf85"] Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.833603 4774 generic.go:334] "Generic (PLEG): container finished" podID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerID="6465d5511a55940f8351014baa8c6a7ed72cb594b375d7ab753e5807886314bf" exitCode=0 Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.833673 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxls" event={"ID":"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8","Type":"ContainerDied","Data":"6465d5511a55940f8351014baa8c6a7ed72cb594b375d7ab753e5807886314bf"} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 
13:39:50.833705 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxls" event={"ID":"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8","Type":"ContainerStarted","Data":"d8a51a8ff1c5ed214d3e9a65a1e55b737b3aed5292cdba59b5fae6d7eb0f7834"} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.835356 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:39:50 crc kubenswrapper[4774]: W1001 13:39:50.838779 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f32cd0_a153_4829_9671_bcf21227cc13.slice/crio-bb0d8ff27c064bb37f5ea2e033963cb22752b2f1acf86feeab61b565d45d7bc7 WatchSource:0}: Error finding container bb0d8ff27c064bb37f5ea2e033963cb22752b2f1acf86feeab61b565d45d7bc7: Status 404 returned error can't find the container with id bb0d8ff27c064bb37f5ea2e033963cb22752b2f1acf86feeab61b565d45d7bc7 Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.839917 4774 generic.go:334] "Generic (PLEG): container finished" podID="e6f2d4ce-a586-48bb-b821-71d8a9a988f8" containerID="3354288fa35a9773464f7399acee45b3a9513b6dd5f49f515624373bba853ab6" exitCode=0 Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.840574 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" event={"ID":"e6f2d4ce-a586-48bb-b821-71d8a9a988f8","Type":"ContainerDied","Data":"3354288fa35a9773464f7399acee45b3a9513b6dd5f49f515624373bba853ab6"} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.842791 4774 generic.go:334] "Generic (PLEG): container finished" podID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerID="95e384c14f902e6a12246a2e6543e0a0760e91b4fa12c800970685f65ba6c907" exitCode=0 Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.842846 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7fkzd" event={"ID":"a572962b-dbdb-4313-ada5-20c8eddfed11","Type":"ContainerDied","Data":"95e384c14f902e6a12246a2e6543e0a0760e91b4fa12c800970685f65ba6c907"} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.842873 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fkzd" event={"ID":"a572962b-dbdb-4313-ada5-20c8eddfed11","Type":"ContainerStarted","Data":"c53780b4c63b004bd134cdd1db99595dfe6a19e742700a8d1987c20b442e86b1"} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.844496 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" event={"ID":"82f9c75f-2ec4-4089-88b1-0bb1ba287f16","Type":"ContainerStarted","Data":"fe0783984cc7dcc481abf7afde0858cc14ea1188e82b072bc7244382a4cea8db"} Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.887086 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 13:39:50 crc kubenswrapper[4774]: I1001 13:39:50.929923 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fj4dc"] Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.038189 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.209433 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.380025 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:51 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:51 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:51 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.380580 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.709631 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rgjkf"] Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.710546 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.713029 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.724589 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgjkf"] Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.814702 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-utilities\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.815043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6n9\" (UniqueName: \"kubernetes.io/projected/9e764fef-cdd8-40bc-88cb-cad642239efc-kube-api-access-xg6n9\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.815082 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-catalog-content\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.852577 4774 generic.go:334] "Generic (PLEG): container finished" podID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerID="9298b183a43533eb082c6d14c8f1ca4f0db87dbcf7f77daf8aaeef8fbeb5261a" exitCode=0 Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 
13:39:51.852662 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf85" event={"ID":"f0f32cd0-a153-4829-9671-bcf21227cc13","Type":"ContainerDied","Data":"9298b183a43533eb082c6d14c8f1ca4f0db87dbcf7f77daf8aaeef8fbeb5261a"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.852692 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf85" event={"ID":"f0f32cd0-a153-4829-9671-bcf21227cc13","Type":"ContainerStarted","Data":"bb0d8ff27c064bb37f5ea2e033963cb22752b2f1acf86feeab61b565d45d7bc7"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.854172 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"49d829f3-7369-4ca1-a2b4-d4463250496f","Type":"ContainerStarted","Data":"bf1e5ea1ae36ad6079b3f949f48a898780de4f98b3a041ef0d1a69d76f2ca553"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.854229 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"49d829f3-7369-4ca1-a2b4-d4463250496f","Type":"ContainerStarted","Data":"b7ef008b4185b7eb482ab7e4564706c8362a2758d488c37a0abe06b9aeb3a36e"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.858471 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" event={"ID":"82f9c75f-2ec4-4089-88b1-0bb1ba287f16","Type":"ContainerStarted","Data":"05c0e75da268172c12d27baef0368a761d6cdc7660bb265c79e1486c3bdd457e"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.859144 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.863330 4774 generic.go:334] "Generic (PLEG): container finished" podID="27e9ea5d-233d-4792-ab12-da809240a05c" 
containerID="ed421b16d26687599b3f4dc592e06a7cc643f613f376341e16a80d3b6d4f1776" exitCode=0 Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.863464 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fj4dc" event={"ID":"27e9ea5d-233d-4792-ab12-da809240a05c","Type":"ContainerDied","Data":"ed421b16d26687599b3f4dc592e06a7cc643f613f376341e16a80d3b6d4f1776"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.863508 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fj4dc" event={"ID":"27e9ea5d-233d-4792-ab12-da809240a05c","Type":"ContainerStarted","Data":"3b09d7ddb268ff2b7a3efb531e99fc3c221869445ae013aa31e1725f3c8055ef"} Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.910588 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.910567339 podStartE2EDuration="1.910567339s" podCreationTimestamp="2025-10-01 13:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:51.88697891 +0000 UTC m=+163.776609527" watchObservedRunningTime="2025-10-01 13:39:51.910567339 +0000 UTC m=+163.800197946" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.915915 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-catalog-content\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.915978 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-utilities\") pod \"redhat-marketplace-rgjkf\" (UID: 
\"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.916029 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6n9\" (UniqueName: \"kubernetes.io/projected/9e764fef-cdd8-40bc-88cb-cad642239efc-kube-api-access-xg6n9\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.917343 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-catalog-content\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.917354 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-utilities\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.936637 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" podStartSLOduration=140.936620577 podStartE2EDuration="2m20.936620577s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:51.935603089 +0000 UTC m=+163.825233706" watchObservedRunningTime="2025-10-01 13:39:51.936620577 +0000 UTC m=+163.826251174" Oct 01 13:39:51 crc kubenswrapper[4774]: I1001 13:39:51.945552 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xg6n9\" (UniqueName: \"kubernetes.io/projected/9e764fef-cdd8-40bc-88cb-cad642239efc-kube-api-access-xg6n9\") pod \"redhat-marketplace-rgjkf\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") " pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.025006 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.074336 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.075134 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.079046 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.079114 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.081106 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.111274 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m62m5"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.126296 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m62m5"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.126437 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.197531 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.205491 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.214384 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-nndfg" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.247066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-utilities\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.247158 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.247251 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-catalog-content\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.247288 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.247314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtqk4\" (UniqueName: \"kubernetes.io/projected/e5248d9d-f880-4802-ab66-c9ecf256c2b8-kube-api-access-wtqk4\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.347974 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-config-volume\") pod \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348054 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-secret-volume\") pod \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348114 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np74x\" (UniqueName: \"kubernetes.io/projected/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-kube-api-access-np74x\") pod \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\" (UID: \"e6f2d4ce-a586-48bb-b821-71d8a9a988f8\") " Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348370 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-utilities\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348422 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348541 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-catalog-content\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348583 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.348610 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtqk4\" (UniqueName: \"kubernetes.io/projected/e5248d9d-f880-4802-ab66-c9ecf256c2b8-kube-api-access-wtqk4\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.350087 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6f2d4ce-a586-48bb-b821-71d8a9a988f8" (UID: "e6f2d4ce-a586-48bb-b821-71d8a9a988f8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.353210 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.353720 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-utilities\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.353730 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-catalog-content\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.357671 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6f2d4ce-a586-48bb-b821-71d8a9a988f8" (UID: "e6f2d4ce-a586-48bb-b821-71d8a9a988f8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.357847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-kube-api-access-np74x" (OuterVolumeSpecName: "kube-api-access-np74x") pod "e6f2d4ce-a586-48bb-b821-71d8a9a988f8" (UID: "e6f2d4ce-a586-48bb-b821-71d8a9a988f8"). InnerVolumeSpecName "kube-api-access-np74x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.383029 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:52 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:52 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:52 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.383105 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.388334 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.388706 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtqk4\" (UniqueName: 
\"kubernetes.io/projected/e5248d9d-f880-4802-ab66-c9ecf256c2b8-kube-api-access-wtqk4\") pod \"redhat-marketplace-m62m5\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.397503 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgjkf"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.410913 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:52 crc kubenswrapper[4774]: W1001 13:39:52.412415 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e764fef_cdd8_40bc_88cb_cad642239efc.slice/crio-99016bc816a33089a44bd10ee21a08ab1438eb6a16d8e06f3f1acbf578d9b1b6 WatchSource:0}: Error finding container 99016bc816a33089a44bd10ee21a08ab1438eb6a16d8e06f3f1acbf578d9b1b6: Status 404 returned error can't find the container with id 99016bc816a33089a44bd10ee21a08ab1438eb6a16d8e06f3f1acbf578d9b1b6 Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.456576 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np74x\" (UniqueName: \"kubernetes.io/projected/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-kube-api-access-np74x\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.456628 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.456641 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6f2d4ce-a586-48bb-b821-71d8a9a988f8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 
13:39:52.493472 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.660251 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.693953 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-9f8ft container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.694000 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9f8ft" podUID="1c4215dd-b2d9-4617-9bc4-43536f0a06f6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.694013 4774 patch_prober.go:28] interesting pod/downloads-7954f5f757-9f8ft container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.694078 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9f8ft" podUID="1c4215dd-b2d9-4617-9bc4-43536f0a06f6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.791782 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m62m5"] Oct 01 13:39:52 crc kubenswrapper[4774]: W1001 13:39:52.844894 4774 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5248d9d_f880_4802_ab66_c9ecf256c2b8.slice/crio-88faf3866758456d60945b5b2b5c9686630d10e379ac66c15aa95b74d499423a WatchSource:0}: Error finding container 88faf3866758456d60945b5b2b5c9686630d10e379ac66c15aa95b74d499423a: Status 404 returned error can't find the container with id 88faf3866758456d60945b5b2b5c9686630d10e379ac66c15aa95b74d499423a Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.877117 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerID="f3897e7617c265bc7a59365b62498159860ced5621a9d6d71ea444549ff5b9dd" exitCode=0 Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.906639 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgjkf" event={"ID":"9e764fef-cdd8-40bc-88cb-cad642239efc","Type":"ContainerDied","Data":"f3897e7617c265bc7a59365b62498159860ced5621a9d6d71ea444549ff5b9dd"} Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.906930 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgjkf" event={"ID":"9e764fef-cdd8-40bc-88cb-cad642239efc","Type":"ContainerStarted","Data":"99016bc816a33089a44bd10ee21a08ab1438eb6a16d8e06f3f1acbf578d9b1b6"} Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.909722 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" event={"ID":"e6f2d4ce-a586-48bb-b821-71d8a9a988f8","Type":"ContainerDied","Data":"9b76e198836f33d10f236b39fd38a7d3222e454298cda0741d9f84fef24900a8"} Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.909761 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b76e198836f33d10f236b39fd38a7d3222e454298cda0741d9f84fef24900a8" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.909987 4774 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.922744 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k2p9p"] Oct 01 13:39:52 crc kubenswrapper[4774]: E1001 13:39:52.922940 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f2d4ce-a586-48bb-b821-71d8a9a988f8" containerName="collect-profiles" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.922951 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f2d4ce-a586-48bb-b821-71d8a9a988f8" containerName="collect-profiles" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.923055 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f2d4ce-a586-48bb-b821-71d8a9a988f8" containerName="collect-profiles" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.923758 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.925857 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.939615 4774 generic.go:334] "Generic (PLEG): container finished" podID="49d829f3-7369-4ca1-a2b4-d4463250496f" containerID="bf1e5ea1ae36ad6079b3f949f48a898780de4f98b3a041ef0d1a69d76f2ca553" exitCode=0 Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.939772 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"49d829f3-7369-4ca1-a2b4-d4463250496f","Type":"ContainerDied","Data":"bf1e5ea1ae36ad6079b3f949f48a898780de4f98b3a041ef0d1a69d76f2ca553"} Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.941349 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2p9p"] Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.947018 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m62m5" event={"ID":"e5248d9d-f880-4802-ab66-c9ecf256c2b8","Type":"ContainerStarted","Data":"88faf3866758456d60945b5b2b5c9686630d10e379ac66c15aa95b74d499423a"} Oct 01 13:39:52 crc kubenswrapper[4774]: I1001 13:39:52.952433 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2","Type":"ContainerStarted","Data":"d9a3b58594459fa01ef94bde292cd3cd547e9fa637e6995e352a2f7334a8f8cd"} Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.071219 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-utilities\") pod \"redhat-operators-k2p9p\" (UID: 
\"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.071645 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-catalog-content\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.072253 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlsdx\" (UniqueName: \"kubernetes.io/projected/0930b18d-4424-4085-917f-114bb3efe343-kube-api-access-mlsdx\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.173350 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-utilities\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.173420 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-catalog-content\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.173469 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlsdx\" (UniqueName: \"kubernetes.io/projected/0930b18d-4424-4085-917f-114bb3efe343-kube-api-access-mlsdx\") pod \"redhat-operators-k2p9p\" 
(UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.174091 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-utilities\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.174297 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-catalog-content\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.191860 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlsdx\" (UniqueName: \"kubernetes.io/projected/0930b18d-4424-4085-917f-114bb3efe343-kube-api-access-mlsdx\") pod \"redhat-operators-k2p9p\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") " pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.315331 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.316274 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jmh9n"] Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.317338 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.323526 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jmh9n"] Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.377105 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.380070 4774 patch_prober.go:28] interesting pod/router-default-5444994796-8gh62 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:39:53 crc kubenswrapper[4774]: [-]has-synced failed: reason withheld Oct 01 13:39:53 crc kubenswrapper[4774]: [+]process-running ok Oct 01 13:39:53 crc kubenswrapper[4774]: healthz check failed Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.380129 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8gh62" podUID="ed9c8df6-bd8f-4e4b-a4fd-ac0d26cf2823" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.488914 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-catalog-content\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.488990 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpgsr\" (UniqueName: \"kubernetes.io/projected/fc70b8cc-0865-43ad-b407-0205067634dc-kube-api-access-zpgsr\") pod \"redhat-operators-jmh9n\" 
(UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.489015 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-utilities\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.591718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-catalog-content\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.591762 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpgsr\" (UniqueName: \"kubernetes.io/projected/fc70b8cc-0865-43ad-b407-0205067634dc-kube-api-access-zpgsr\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.591890 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-utilities\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.592368 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-catalog-content\") pod \"redhat-operators-jmh9n\" (UID: 
\"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.592515 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-utilities\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.612544 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpgsr\" (UniqueName: \"kubernetes.io/projected/fc70b8cc-0865-43ad-b407-0205067634dc-kube-api-access-zpgsr\") pod \"redhat-operators-jmh9n\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.692141 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.726385 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.726464 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.729576 4774 patch_prober.go:28] interesting pod/console-f9d7485db-r7bcf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.33:8443/health\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.729618 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-r7bcf" podUID="16564c2c-07e2-4d6e-9f35-fd14654d1538" containerName="console" probeResult="failure" 
output="Get \"https://10.217.0.33:8443/health\": dial tcp 10.217.0.33:8443: connect: connection refused" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.795897 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2p9p"] Oct 01 13:39:53 crc kubenswrapper[4774]: W1001 13:39:53.848473 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0930b18d_4424_4085_917f_114bb3efe343.slice/crio-e70b26636cbcd58b0784a1125d09382b62ed7bfd7cdac31258d3bcf0cf2cd7b6 WatchSource:0}: Error finding container e70b26636cbcd58b0784a1125d09382b62ed7bfd7cdac31258d3bcf0cf2cd7b6: Status 404 returned error can't find the container with id e70b26636cbcd58b0784a1125d09382b62ed7bfd7cdac31258d3bcf0cf2cd7b6 Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.902323 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.909067 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/67555194-dc73-4f0a-bd6e-1ae0a010067a-metrics-certs\") pod \"network-metrics-daemon-hgfsz\" (UID: \"67555194-dc73-4f0a-bd6e-1ae0a010067a\") " pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.920097 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jmh9n"] Oct 01 13:39:53 crc kubenswrapper[4774]: W1001 13:39:53.931873 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc70b8cc_0865_43ad_b407_0205067634dc.slice/crio-dd7ddf25a7cd26e66ffdda33be3d0aa9764e68efe10b7520d80b277a6a80ce5d WatchSource:0}: Error finding container dd7ddf25a7cd26e66ffdda33be3d0aa9764e68efe10b7520d80b277a6a80ce5d: Status 404 returned error can't find the container with id dd7ddf25a7cd26e66ffdda33be3d0aa9764e68efe10b7520d80b277a6a80ce5d Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.958764 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2p9p" event={"ID":"0930b18d-4424-4085-917f-114bb3efe343","Type":"ContainerStarted","Data":"e70b26636cbcd58b0784a1125d09382b62ed7bfd7cdac31258d3bcf0cf2cd7b6"} Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.959804 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmh9n" event={"ID":"fc70b8cc-0865-43ad-b407-0205067634dc","Type":"ContainerStarted","Data":"dd7ddf25a7cd26e66ffdda33be3d0aa9764e68efe10b7520d80b277a6a80ce5d"} Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.961728 4774 generic.go:334] "Generic (PLEG): container finished" podID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerID="bbbf3c7026cda2ea2fb5d8bc9f90b2b7a8db88d26bd4527bbc20a89a1e73f9c1" exitCode=0 Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.961778 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m62m5" event={"ID":"e5248d9d-f880-4802-ab66-c9ecf256c2b8","Type":"ContainerDied","Data":"bbbf3c7026cda2ea2fb5d8bc9f90b2b7a8db88d26bd4527bbc20a89a1e73f9c1"} Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.970211 4774 generic.go:334] "Generic (PLEG): container finished" podID="43108b2a-ea14-4eff-a9ed-e9600f1d9ae2" containerID="80349861a2feb30efcca59f5c5024afcf14f88bc0e3c5ed9d1c9e7babd5bb3eb" exitCode=0 Oct 01 13:39:53 crc kubenswrapper[4774]: I1001 13:39:53.970471 4774 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2","Type":"ContainerDied","Data":"80349861a2feb30efcca59f5c5024afcf14f88bc0e3c5ed9d1c9e7babd5bb3eb"} Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.102052 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hgfsz" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.252676 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.379484 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.381917 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8gh62" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.411344 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49d829f3-7369-4ca1-a2b4-d4463250496f-kubelet-dir\") pod \"49d829f3-7369-4ca1-a2b4-d4463250496f\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.411465 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49d829f3-7369-4ca1-a2b4-d4463250496f-kube-api-access\") pod \"49d829f3-7369-4ca1-a2b4-d4463250496f\" (UID: \"49d829f3-7369-4ca1-a2b4-d4463250496f\") " Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.412168 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49d829f3-7369-4ca1-a2b4-d4463250496f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "49d829f3-7369-4ca1-a2b4-d4463250496f" (UID: 
"49d829f3-7369-4ca1-a2b4-d4463250496f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.420970 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d829f3-7369-4ca1-a2b4-d4463250496f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "49d829f3-7369-4ca1-a2b4-d4463250496f" (UID: "49d829f3-7369-4ca1-a2b4-d4463250496f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.513265 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49d829f3-7369-4ca1-a2b4-d4463250496f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.513437 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49d829f3-7369-4ca1-a2b4-d4463250496f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.703806 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hgfsz"] Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.991313 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"49d829f3-7369-4ca1-a2b4-d4463250496f","Type":"ContainerDied","Data":"b7ef008b4185b7eb482ab7e4564706c8362a2758d488c37a0abe06b9aeb3a36e"} Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.991370 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7ef008b4185b7eb482ab7e4564706c8362a2758d488c37a0abe06b9aeb3a36e" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.991426 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:39:54 crc kubenswrapper[4774]: I1001 13:39:54.997293 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" event={"ID":"67555194-dc73-4f0a-bd6e-1ae0a010067a","Type":"ContainerStarted","Data":"1246eaec7a5527440d2ab80a5feb430e6b236776e1689a22a2aedc80bd482300"} Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.002011 4774 generic.go:334] "Generic (PLEG): container finished" podID="0930b18d-4424-4085-917f-114bb3efe343" containerID="3fe85745d27ca09ea26a876f54e2af4748773b5df9a07e1ac0c5c70967208343" exitCode=0 Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.002058 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2p9p" event={"ID":"0930b18d-4424-4085-917f-114bb3efe343","Type":"ContainerDied","Data":"3fe85745d27ca09ea26a876f54e2af4748773b5df9a07e1ac0c5c70967208343"} Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.016203 4774 generic.go:334] "Generic (PLEG): container finished" podID="fc70b8cc-0865-43ad-b407-0205067634dc" containerID="6df3d893038350f0779aa7726ab40293e257b18369524589f9cf7363e936199d" exitCode=0 Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.016484 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmh9n" event={"ID":"fc70b8cc-0865-43ad-b407-0205067634dc","Type":"ContainerDied","Data":"6df3d893038350f0779aa7726ab40293e257b18369524589f9cf7363e936199d"} Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.290997 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.344207 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kubelet-dir\") pod \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.344285 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kube-api-access\") pod \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\" (UID: \"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2\") " Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.345085 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "43108b2a-ea14-4eff-a9ed-e9600f1d9ae2" (UID: "43108b2a-ea14-4eff-a9ed-e9600f1d9ae2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.351066 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "43108b2a-ea14-4eff-a9ed-e9600f1d9ae2" (UID: "43108b2a-ea14-4eff-a9ed-e9600f1d9ae2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.445050 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:55 crc kubenswrapper[4774]: I1001 13:39:55.445078 4774 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43108b2a-ea14-4eff-a9ed-e9600f1d9ae2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:56 crc kubenswrapper[4774]: I1001 13:39:56.027589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" event={"ID":"67555194-dc73-4f0a-bd6e-1ae0a010067a","Type":"ContainerStarted","Data":"afafe17f85f1ff6895030d322ea04a72a93c59a15d9460de9de821dc1dad1821"} Oct 01 13:39:56 crc kubenswrapper[4774]: I1001 13:39:56.030318 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"43108b2a-ea14-4eff-a9ed-e9600f1d9ae2","Type":"ContainerDied","Data":"d9a3b58594459fa01ef94bde292cd3cd547e9fa637e6995e352a2f7334a8f8cd"} Oct 01 13:39:56 crc kubenswrapper[4774]: I1001 13:39:56.030349 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a3b58594459fa01ef94bde292cd3cd547e9fa637e6995e352a2f7334a8f8cd" Oct 01 13:39:56 crc kubenswrapper[4774]: I1001 13:39:56.030411 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:39:57 crc kubenswrapper[4774]: I1001 13:39:57.046897 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hgfsz" event={"ID":"67555194-dc73-4f0a-bd6e-1ae0a010067a","Type":"ContainerStarted","Data":"6bb1588439e10ab4a7b9c0b1d04d0365a483d8d25119b919fae8be225bbd17be"} Oct 01 13:39:57 crc kubenswrapper[4774]: I1001 13:39:57.071007 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hgfsz" podStartSLOduration=146.070991955 podStartE2EDuration="2m26.070991955s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:39:57.068079594 +0000 UTC m=+168.957710211" watchObservedRunningTime="2025-10-01 13:39:57.070991955 +0000 UTC m=+168.960622552" Oct 01 13:39:58 crc kubenswrapper[4774]: I1001 13:39:58.811703 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kk6dj" Oct 01 13:40:02 crc kubenswrapper[4774]: I1001 13:40:02.713426 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9f8ft" Oct 01 13:40:03 crc kubenswrapper[4774]: I1001 13:40:03.734740 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:40:03 crc kubenswrapper[4774]: I1001 13:40:03.738847 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-r7bcf" Oct 01 13:40:07 crc kubenswrapper[4774]: I1001 13:40:07.271340 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:40:07 crc kubenswrapper[4774]: I1001 13:40:07.271849 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:40:08 crc kubenswrapper[4774]: I1001 13:40:08.070958 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:40:10 crc kubenswrapper[4774]: I1001 13:40:10.537256 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:40:15 crc kubenswrapper[4774]: E1001 13:40:15.005030 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 13:40:15 crc kubenswrapper[4774]: E1001 13:40:15.005712 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wtqk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-m62m5_openshift-marketplace(e5248d9d-f880-4802-ab66-c9ecf256c2b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:40:15 crc kubenswrapper[4774]: E1001 13:40:15.007063 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-m62m5" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" Oct 01 13:40:16 crc 
kubenswrapper[4774]: E1001 13:40:16.870714 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-m62m5" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" Oct 01 13:40:18 crc kubenswrapper[4774]: E1001 13:40:18.762088 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 13:40:18 crc kubenswrapper[4774]: E1001 13:40:18.762340 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6cz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-fj4dc_openshift-marketplace(27e9ea5d-233d-4792-ab12-da809240a05c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:40:18 crc kubenswrapper[4774]: E1001 13:40:18.763606 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-fj4dc" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" Oct 01 13:40:20 crc 
kubenswrapper[4774]: E1001 13:40:20.695155 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-fj4dc" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" Oct 01 13:40:20 crc kubenswrapper[4774]: E1001 13:40:20.702523 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 13:40:20 crc kubenswrapper[4774]: E1001 13:40:20.702876 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2d4rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rjxls_openshift-marketplace(d47e8bfd-f0b4-494b-9125-7b0dcf336ff8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:40:20 crc kubenswrapper[4774]: E1001 13:40:20.705003 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rjxls" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" Oct 01 13:40:21 crc 
kubenswrapper[4774]: E1001 13:40:21.095176 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.095400 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5ddcf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-7fkzd_openshift-marketplace(a572962b-dbdb-4313-ada5-20c8eddfed11): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.098596 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7fkzd" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.555536 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7fkzd" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.555536 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rjxls" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.644668 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.645125 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5bjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zwf85_openshift-marketplace(f0f32cd0-a153-4829-9671-bcf21227cc13): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.646283 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zwf85" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.650684 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.650826 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zpgsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jmh9n_openshift-marketplace(fc70b8cc-0865-43ad-b407-0205067634dc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\": context canceled" logger="UnhandledError" Oct 01 13:40:21 crc kubenswrapper[4774]: E1001 13:40:21.652057 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-jmh9n" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" Oct 01 13:40:22 crc kubenswrapper[4774]: I1001 13:40:22.787811 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-gd5zm" Oct 01 13:40:22 crc kubenswrapper[4774]: I1001 13:40:22.938980 4774 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","pode6f2d4ce-a586-48bb-b821-71d8a9a988f8"] err="unable to destroy cgroup paths for cgroup [kubepods burstable pode6f2d4ce-a586-48bb-b821-71d8a9a988f8] : Timed out while waiting for systemd to remove kubepods-burstable-pode6f2d4ce_a586_48bb_b821_71d8a9a988f8.slice" Oct 01 13:40:23 crc kubenswrapper[4774]: E1001 13:40:23.483872 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jmh9n" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" Oct 01 13:40:23 crc kubenswrapper[4774]: E1001 13:40:23.483889 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zwf85" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" Oct 01 13:40:23 crc kubenswrapper[4774]: E1001 13:40:23.521505 4774 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc 
= copying system image from manifest list: reading blob sha256:61a409c24d11b54f24f090d55318134fa26a3ff9568ff4c72edd9a417875cf23: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:61a409c24d11b54f24f090d55318134fa26a3ff9568ff4c72edd9a417875cf23\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 13:40:23 crc kubenswrapper[4774]: E1001 13:40:23.521644 4774 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mlsdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},
RestartPolicy:nil,} start failed in pod redhat-operators-k2p9p_openshift-marketplace(0930b18d-4424-4085-917f-114bb3efe343): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:61a409c24d11b54f24f090d55318134fa26a3ff9568ff4c72edd9a417875cf23: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:61a409c24d11b54f24f090d55318134fa26a3ff9568ff4c72edd9a417875cf23\": context canceled" logger="UnhandledError" Oct 01 13:40:23 crc kubenswrapper[4774]: E1001 13:40:23.522806 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:61a409c24d11b54f24f090d55318134fa26a3ff9568ff4c72edd9a417875cf23: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:61a409c24d11b54f24f090d55318134fa26a3ff9568ff4c72edd9a417875cf23\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-k2p9p" podUID="0930b18d-4424-4085-917f-114bb3efe343" Oct 01 13:40:24 crc kubenswrapper[4774]: I1001 13:40:24.255142 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerID="0daa6951dc8600bdf129ea7c0059ec137b09468ef092d65eb8e5e07d118ae776" exitCode=0 Oct 01 13:40:24 crc kubenswrapper[4774]: I1001 13:40:24.255231 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgjkf" event={"ID":"9e764fef-cdd8-40bc-88cb-cad642239efc","Type":"ContainerDied","Data":"0daa6951dc8600bdf129ea7c0059ec137b09468ef092d65eb8e5e07d118ae776"} Oct 01 13:40:24 crc kubenswrapper[4774]: E1001 13:40:24.258012 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-k2p9p" podUID="0930b18d-4424-4085-917f-114bb3efe343" Oct 01 13:40:25 crc kubenswrapper[4774]: I1001 13:40:25.267276 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgjkf" event={"ID":"9e764fef-cdd8-40bc-88cb-cad642239efc","Type":"ContainerStarted","Data":"27f7d9a09745e4820f9afaa995d2cfccd7f51adbc22c616cd08d30d4d0438ba3"} Oct 01 13:40:25 crc kubenswrapper[4774]: I1001 13:40:25.295645 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rgjkf" podStartSLOduration=2.533046475 podStartE2EDuration="34.29562147s" podCreationTimestamp="2025-10-01 13:39:51 +0000 UTC" firstStartedPulling="2025-10-01 13:39:52.906471677 +0000 UTC m=+164.796102274" lastFinishedPulling="2025-10-01 13:40:24.669046672 +0000 UTC m=+196.558677269" observedRunningTime="2025-10-01 13:40:25.293290195 +0000 UTC m=+197.182920832" watchObservedRunningTime="2025-10-01 13:40:25.29562147 +0000 UTC m=+197.185252097" Oct 01 13:40:32 crc kubenswrapper[4774]: I1001 13:40:32.026421 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:40:32 crc kubenswrapper[4774]: I1001 13:40:32.027121 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:40:32 crc kubenswrapper[4774]: I1001 13:40:32.220563 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:40:32 crc kubenswrapper[4774]: I1001 13:40:32.359139 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rgjkf" Oct 01 13:40:34 crc kubenswrapper[4774]: I1001 13:40:34.326113 4774 generic.go:334] "Generic (PLEG): container finished" podID="a572962b-dbdb-4313-ada5-20c8eddfed11" 
containerID="979472644c5aa28538b5bbd0014643f8c7e5f50933da1458307c967409bfb22a" exitCode=0 Oct 01 13:40:34 crc kubenswrapper[4774]: I1001 13:40:34.326208 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fkzd" event={"ID":"a572962b-dbdb-4313-ada5-20c8eddfed11","Type":"ContainerDied","Data":"979472644c5aa28538b5bbd0014643f8c7e5f50933da1458307c967409bfb22a"} Oct 01 13:40:34 crc kubenswrapper[4774]: I1001 13:40:34.333971 4774 generic.go:334] "Generic (PLEG): container finished" podID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerID="ab3b4d9f48a6566e853cfa05866b5c0183e01f3f246208f5ff7fcf530975c123" exitCode=0 Oct 01 13:40:34 crc kubenswrapper[4774]: I1001 13:40:34.334101 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m62m5" event={"ID":"e5248d9d-f880-4802-ab66-c9ecf256c2b8","Type":"ContainerDied","Data":"ab3b4d9f48a6566e853cfa05866b5c0183e01f3f246208f5ff7fcf530975c123"} Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.343499 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fkzd" event={"ID":"a572962b-dbdb-4313-ada5-20c8eddfed11","Type":"ContainerStarted","Data":"58523b0d8539685f41710699ed361f42610a560bef5f23c7790e2fdf340adbd9"} Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.349517 4774 generic.go:334] "Generic (PLEG): container finished" podID="27e9ea5d-233d-4792-ab12-da809240a05c" containerID="2a7625baabddd6c38de202c83301df156972764d848f55083ba98230765aa291" exitCode=0 Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.349637 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fj4dc" event={"ID":"27e9ea5d-233d-4792-ab12-da809240a05c","Type":"ContainerDied","Data":"2a7625baabddd6c38de202c83301df156972764d848f55083ba98230765aa291"} Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.353034 4774 generic.go:334] "Generic (PLEG): container 
finished" podID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerID="8e1aa82ef3bad8d418716d4af17ba9450b644f338b7ade9d36439b2bbdde681a" exitCode=0 Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.353094 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf85" event={"ID":"f0f32cd0-a153-4829-9671-bcf21227cc13","Type":"ContainerDied","Data":"8e1aa82ef3bad8d418716d4af17ba9450b644f338b7ade9d36439b2bbdde681a"} Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.360746 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m62m5" event={"ID":"e5248d9d-f880-4802-ab66-c9ecf256c2b8","Type":"ContainerStarted","Data":"196891fa768b6a06a3173d05a1b08461248cce1e6e380d984532e68ce50973c0"} Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.364870 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7fkzd" podStartSLOduration=2.15906102 podStartE2EDuration="46.364846241s" podCreationTimestamp="2025-10-01 13:39:49 +0000 UTC" firstStartedPulling="2025-10-01 13:39:50.846057604 +0000 UTC m=+162.735688201" lastFinishedPulling="2025-10-01 13:40:35.051842815 +0000 UTC m=+206.941473422" observedRunningTime="2025-10-01 13:40:35.361668331 +0000 UTC m=+207.251298968" watchObservedRunningTime="2025-10-01 13:40:35.364846241 +0000 UTC m=+207.254476868" Oct 01 13:40:35 crc kubenswrapper[4774]: I1001 13:40:35.464272 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m62m5" podStartSLOduration=2.611712202 podStartE2EDuration="43.464255684s" podCreationTimestamp="2025-10-01 13:39:52 +0000 UTC" firstStartedPulling="2025-10-01 13:39:53.963286768 +0000 UTC m=+165.852917365" lastFinishedPulling="2025-10-01 13:40:34.81583023 +0000 UTC m=+206.705460847" observedRunningTime="2025-10-01 13:40:35.461884017 +0000 UTC m=+207.351514624" 
watchObservedRunningTime="2025-10-01 13:40:35.464255684 +0000 UTC m=+207.353886291" Oct 01 13:40:36 crc kubenswrapper[4774]: I1001 13:40:36.368893 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fj4dc" event={"ID":"27e9ea5d-233d-4792-ab12-da809240a05c","Type":"ContainerStarted","Data":"8a1bd5f0b2a6469bc430efbf258c186328bfbdec60c05eed4fba80e06040d7ab"} Oct 01 13:40:36 crc kubenswrapper[4774]: I1001 13:40:36.371659 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf85" event={"ID":"f0f32cd0-a153-4829-9671-bcf21227cc13","Type":"ContainerStarted","Data":"8a9866e14efaa3f0d8ba1710f21fde8a6639ef8fdf18fa7dd318e52f7ca3c900"} Oct 01 13:40:36 crc kubenswrapper[4774]: I1001 13:40:36.393221 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fj4dc" podStartSLOduration=2.204796328 podStartE2EDuration="46.393197159s" podCreationTimestamp="2025-10-01 13:39:50 +0000 UTC" firstStartedPulling="2025-10-01 13:39:51.86478942 +0000 UTC m=+163.754420017" lastFinishedPulling="2025-10-01 13:40:36.053190251 +0000 UTC m=+207.942820848" observedRunningTime="2025-10-01 13:40:36.388467885 +0000 UTC m=+208.278098492" watchObservedRunningTime="2025-10-01 13:40:36.393197159 +0000 UTC m=+208.282827796" Oct 01 13:40:36 crc kubenswrapper[4774]: I1001 13:40:36.407091 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwf85" podStartSLOduration=2.39984902 podStartE2EDuration="46.40706909s" podCreationTimestamp="2025-10-01 13:39:50 +0000 UTC" firstStartedPulling="2025-10-01 13:39:51.854856872 +0000 UTC m=+163.744487469" lastFinishedPulling="2025-10-01 13:40:35.862076942 +0000 UTC m=+207.751707539" observedRunningTime="2025-10-01 13:40:36.405388262 +0000 UTC m=+208.295018879" watchObservedRunningTime="2025-10-01 13:40:36.40706909 +0000 UTC m=+208.296699717" Oct 01 
13:40:37 crc kubenswrapper[4774]: I1001 13:40:37.271302 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:40:37 crc kubenswrapper[4774]: I1001 13:40:37.271603 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:40:37 crc kubenswrapper[4774]: I1001 13:40:37.271652 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:40:37 crc kubenswrapper[4774]: I1001 13:40:37.272221 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:40:37 crc kubenswrapper[4774]: I1001 13:40:37.272332 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38" gracePeriod=600 Oct 01 13:40:38 crc kubenswrapper[4774]: I1001 13:40:38.396302 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" 
containerID="98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38" exitCode=0 Oct 01 13:40:38 crc kubenswrapper[4774]: I1001 13:40:38.396387 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38"} Oct 01 13:40:38 crc kubenswrapper[4774]: I1001 13:40:38.396884 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"45fe67a4b4a9a83e2ae4bc8f080985498cc6ec39fabf36b93a55613990ba18db"} Oct 01 13:40:38 crc kubenswrapper[4774]: I1001 13:40:38.399695 4774 generic.go:334] "Generic (PLEG): container finished" podID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerID="37af91829c195307452de213df4229b6410c6c1350935ebf65b03f043402eb59" exitCode=0 Oct 01 13:40:38 crc kubenswrapper[4774]: I1001 13:40:38.399727 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxls" event={"ID":"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8","Type":"ContainerDied","Data":"37af91829c195307452de213df4229b6410c6c1350935ebf65b03f043402eb59"} Oct 01 13:40:39 crc kubenswrapper[4774]: I1001 13:40:39.409901 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxls" event={"ID":"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8","Type":"ContainerStarted","Data":"9b25f05ddc617a644d6b30aa332532ac2228225fc6b97d821fadc648b4045a21"} Oct 01 13:40:39 crc kubenswrapper[4774]: I1001 13:40:39.431740 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rjxls" podStartSLOduration=2.435021645 podStartE2EDuration="50.431724828s" podCreationTimestamp="2025-10-01 13:39:49 +0000 UTC" 
firstStartedPulling="2025-10-01 13:39:50.835121649 +0000 UTC m=+162.724752246" lastFinishedPulling="2025-10-01 13:40:38.831824842 +0000 UTC m=+210.721455429" observedRunningTime="2025-10-01 13:40:39.428927329 +0000 UTC m=+211.318557946" watchObservedRunningTime="2025-10-01 13:40:39.431724828 +0000 UTC m=+211.321355425" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.036730 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.036850 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.231729 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.232031 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.285722 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.464035 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7fkzd" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.486123 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.486563 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.536199 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.643334 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.643465 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:40:40 crc kubenswrapper[4774]: I1001 13:40:40.693534 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:40:41 crc kubenswrapper[4774]: I1001 13:40:41.083289 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rjxls" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="registry-server" probeResult="failure" output=< Oct 01 13:40:41 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 01 13:40:41 crc kubenswrapper[4774]: > Oct 01 13:40:41 crc kubenswrapper[4774]: I1001 13:40:41.471612 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:40:41 crc kubenswrapper[4774]: I1001 13:40:41.471971 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:40:42 crc kubenswrapper[4774]: I1001 13:40:42.120967 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwf85"] Oct 01 13:40:42 crc kubenswrapper[4774]: I1001 13:40:42.323508 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fj4dc"] Oct 01 13:40:42 crc kubenswrapper[4774]: I1001 13:40:42.494744 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:40:42 crc 
kubenswrapper[4774]: I1001 13:40:42.494844 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:40:42 crc kubenswrapper[4774]: I1001 13:40:42.534829 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:40:43 crc kubenswrapper[4774]: I1001 13:40:43.435580 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwf85" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="registry-server" containerID="cri-o://8a9866e14efaa3f0d8ba1710f21fde8a6639ef8fdf18fa7dd318e52f7ca3c900" gracePeriod=2 Oct 01 13:40:43 crc kubenswrapper[4774]: I1001 13:40:43.435712 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fj4dc" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="registry-server" containerID="cri-o://8a1bd5f0b2a6469bc430efbf258c186328bfbdec60c05eed4fba80e06040d7ab" gracePeriod=2 Oct 01 13:40:43 crc kubenswrapper[4774]: I1001 13:40:43.485866 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:40:44 crc kubenswrapper[4774]: I1001 13:40:44.448276 4774 generic.go:334] "Generic (PLEG): container finished" podID="27e9ea5d-233d-4792-ab12-da809240a05c" containerID="8a1bd5f0b2a6469bc430efbf258c186328bfbdec60c05eed4fba80e06040d7ab" exitCode=0 Oct 01 13:40:44 crc kubenswrapper[4774]: I1001 13:40:44.448355 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fj4dc" event={"ID":"27e9ea5d-233d-4792-ab12-da809240a05c","Type":"ContainerDied","Data":"8a1bd5f0b2a6469bc430efbf258c186328bfbdec60c05eed4fba80e06040d7ab"} Oct 01 13:40:44 crc kubenswrapper[4774]: I1001 13:40:44.450989 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerID="8a9866e14efaa3f0d8ba1710f21fde8a6639ef8fdf18fa7dd318e52f7ca3c900" exitCode=0 Oct 01 13:40:44 crc kubenswrapper[4774]: I1001 13:40:44.451082 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf85" event={"ID":"f0f32cd0-a153-4829-9671-bcf21227cc13","Type":"ContainerDied","Data":"8a9866e14efaa3f0d8ba1710f21fde8a6639ef8fdf18fa7dd318e52f7ca3c900"} Oct 01 13:40:46 crc kubenswrapper[4774]: I1001 13:40:46.917793 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m62m5"] Oct 01 13:40:46 crc kubenswrapper[4774]: I1001 13:40:46.918389 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m62m5" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="registry-server" containerID="cri-o://196891fa768b6a06a3173d05a1b08461248cce1e6e380d984532e68ce50973c0" gracePeriod=2 Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.313109 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.399244 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-catalog-content\") pod \"27e9ea5d-233d-4792-ab12-da809240a05c\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.399310 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-utilities\") pod \"27e9ea5d-233d-4792-ab12-da809240a05c\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.399338 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6cz2\" (UniqueName: \"kubernetes.io/projected/27e9ea5d-233d-4792-ab12-da809240a05c-kube-api-access-m6cz2\") pod \"27e9ea5d-233d-4792-ab12-da809240a05c\" (UID: \"27e9ea5d-233d-4792-ab12-da809240a05c\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.401234 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-utilities" (OuterVolumeSpecName: "utilities") pod "27e9ea5d-233d-4792-ab12-da809240a05c" (UID: "27e9ea5d-233d-4792-ab12-da809240a05c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.408712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e9ea5d-233d-4792-ab12-da809240a05c-kube-api-access-m6cz2" (OuterVolumeSpecName: "kube-api-access-m6cz2") pod "27e9ea5d-233d-4792-ab12-da809240a05c" (UID: "27e9ea5d-233d-4792-ab12-da809240a05c"). InnerVolumeSpecName "kube-api-access-m6cz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.478917 4774 generic.go:334] "Generic (PLEG): container finished" podID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerID="196891fa768b6a06a3173d05a1b08461248cce1e6e380d984532e68ce50973c0" exitCode=0 Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.479185 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m62m5" event={"ID":"e5248d9d-f880-4802-ab66-c9ecf256c2b8","Type":"ContainerDied","Data":"196891fa768b6a06a3173d05a1b08461248cce1e6e380d984532e68ce50973c0"} Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.485159 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27e9ea5d-233d-4792-ab12-da809240a05c" (UID: "27e9ea5d-233d-4792-ab12-da809240a05c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.491747 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fj4dc" event={"ID":"27e9ea5d-233d-4792-ab12-da809240a05c","Type":"ContainerDied","Data":"3b09d7ddb268ff2b7a3efb531e99fc3c221869445ae013aa31e1725f3c8055ef"} Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.491811 4774 scope.go:117] "RemoveContainer" containerID="8a1bd5f0b2a6469bc430efbf258c186328bfbdec60c05eed4fba80e06040d7ab" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.491857 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fj4dc" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.500706 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.500784 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e9ea5d-233d-4792-ab12-da809240a05c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.500798 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6cz2\" (UniqueName: \"kubernetes.io/projected/27e9ea5d-233d-4792-ab12-da809240a05c-kube-api-access-m6cz2\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.508765 4774 scope.go:117] "RemoveContainer" containerID="2a7625baabddd6c38de202c83301df156972764d848f55083ba98230765aa291" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.524040 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.530726 4774 scope.go:117] "RemoveContainer" containerID="ed421b16d26687599b3f4dc592e06a7cc643f613f376341e16a80d3b6d4f1776" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.541315 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.549907 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fj4dc"] Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.552134 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fj4dc"] Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.601976 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtqk4\" (UniqueName: \"kubernetes.io/projected/e5248d9d-f880-4802-ab66-c9ecf256c2b8-kube-api-access-wtqk4\") pod \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.602042 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-catalog-content\") pod \"f0f32cd0-a153-4829-9671-bcf21227cc13\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.602076 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-utilities\") pod \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.602111 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-catalog-content\") pod \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\" (UID: \"e5248d9d-f880-4802-ab66-c9ecf256c2b8\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.602136 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-utilities\") pod \"f0f32cd0-a153-4829-9671-bcf21227cc13\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.602155 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bjp\" (UniqueName: \"kubernetes.io/projected/f0f32cd0-a153-4829-9671-bcf21227cc13-kube-api-access-v5bjp\") pod \"f0f32cd0-a153-4829-9671-bcf21227cc13\" (UID: \"f0f32cd0-a153-4829-9671-bcf21227cc13\") " Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.605263 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-utilities" (OuterVolumeSpecName: "utilities") pod "f0f32cd0-a153-4829-9671-bcf21227cc13" (UID: "f0f32cd0-a153-4829-9671-bcf21227cc13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.607946 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f32cd0-a153-4829-9671-bcf21227cc13-kube-api-access-v5bjp" (OuterVolumeSpecName: "kube-api-access-v5bjp") pod "f0f32cd0-a153-4829-9671-bcf21227cc13" (UID: "f0f32cd0-a153-4829-9671-bcf21227cc13"). InnerVolumeSpecName "kube-api-access-v5bjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.608680 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-utilities" (OuterVolumeSpecName: "utilities") pod "e5248d9d-f880-4802-ab66-c9ecf256c2b8" (UID: "e5248d9d-f880-4802-ab66-c9ecf256c2b8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.609705 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5248d9d-f880-4802-ab66-c9ecf256c2b8-kube-api-access-wtqk4" (OuterVolumeSpecName: "kube-api-access-wtqk4") pod "e5248d9d-f880-4802-ab66-c9ecf256c2b8" (UID: "e5248d9d-f880-4802-ab66-c9ecf256c2b8"). InnerVolumeSpecName "kube-api-access-wtqk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.632586 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5248d9d-f880-4802-ab66-c9ecf256c2b8" (UID: "e5248d9d-f880-4802-ab66-c9ecf256c2b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.659857 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0f32cd0-a153-4829-9671-bcf21227cc13" (UID: "f0f32cd0-a153-4829-9671-bcf21227cc13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.703686 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.703983 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.704084 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5bjp\" (UniqueName: \"kubernetes.io/projected/f0f32cd0-a153-4829-9671-bcf21227cc13-kube-api-access-v5bjp\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.704159 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtqk4\" (UniqueName: \"kubernetes.io/projected/e5248d9d-f880-4802-ab66-c9ecf256c2b8-kube-api-access-wtqk4\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.704226 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0f32cd0-a153-4829-9671-bcf21227cc13-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:47 crc kubenswrapper[4774]: I1001 13:40:47.704290 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5248d9d-f880-4802-ab66-c9ecf256c2b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.497822 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m62m5" event={"ID":"e5248d9d-f880-4802-ab66-c9ecf256c2b8","Type":"ContainerDied","Data":"88faf3866758456d60945b5b2b5c9686630d10e379ac66c15aa95b74d499423a"} Oct 
01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.498199 4774 scope.go:117] "RemoveContainer" containerID="196891fa768b6a06a3173d05a1b08461248cce1e6e380d984532e68ce50973c0" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.497847 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m62m5" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.502193 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwf85" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.502196 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwf85" event={"ID":"f0f32cd0-a153-4829-9671-bcf21227cc13","Type":"ContainerDied","Data":"bb0d8ff27c064bb37f5ea2e033963cb22752b2f1acf86feeab61b565d45d7bc7"} Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.505986 4774 generic.go:334] "Generic (PLEG): container finished" podID="0930b18d-4424-4085-917f-114bb3efe343" containerID="fba3b6f0437556dfa716964ef321b064e983f89f888247bdc2b83c5fa20fc487" exitCode=0 Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.506067 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2p9p" event={"ID":"0930b18d-4424-4085-917f-114bb3efe343","Type":"ContainerDied","Data":"fba3b6f0437556dfa716964ef321b064e983f89f888247bdc2b83c5fa20fc487"} Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.510390 4774 generic.go:334] "Generic (PLEG): container finished" podID="fc70b8cc-0865-43ad-b407-0205067634dc" containerID="77f2e6e9574a4391d9978d23eee14eb3b7f692c044d940b9e3474b06d55680a0" exitCode=0 Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.510496 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmh9n" 
event={"ID":"fc70b8cc-0865-43ad-b407-0205067634dc","Type":"ContainerDied","Data":"77f2e6e9574a4391d9978d23eee14eb3b7f692c044d940b9e3474b06d55680a0"} Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.516582 4774 scope.go:117] "RemoveContainer" containerID="ab3b4d9f48a6566e853cfa05866b5c0183e01f3f246208f5ff7fcf530975c123" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.535499 4774 scope.go:117] "RemoveContainer" containerID="bbbf3c7026cda2ea2fb5d8bc9f90b2b7a8db88d26bd4527bbc20a89a1e73f9c1" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.554900 4774 scope.go:117] "RemoveContainer" containerID="8a9866e14efaa3f0d8ba1710f21fde8a6639ef8fdf18fa7dd318e52f7ca3c900" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.566440 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwf85"] Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.573339 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwf85"] Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.579222 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m62m5"] Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.581165 4774 scope.go:117] "RemoveContainer" containerID="8e1aa82ef3bad8d418716d4af17ba9450b644f338b7ade9d36439b2bbdde681a" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.581895 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m62m5"] Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.595331 4774 scope.go:117] "RemoveContainer" containerID="9298b183a43533eb082c6d14c8f1ca4f0db87dbcf7f77daf8aaeef8fbeb5261a" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.877904 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" path="/var/lib/kubelet/pods/27e9ea5d-233d-4792-ab12-da809240a05c/volumes" Oct 01 13:40:48 
crc kubenswrapper[4774]: I1001 13:40:48.879319 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" path="/var/lib/kubelet/pods/e5248d9d-f880-4802-ab66-c9ecf256c2b8/volumes" Oct 01 13:40:48 crc kubenswrapper[4774]: I1001 13:40:48.880206 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" path="/var/lib/kubelet/pods/f0f32cd0-a153-4829-9671-bcf21227cc13/volumes" Oct 01 13:40:49 crc kubenswrapper[4774]: I1001 13:40:49.516300 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmh9n" event={"ID":"fc70b8cc-0865-43ad-b407-0205067634dc","Type":"ContainerStarted","Data":"28dc34d81da3cde75e0da6e3b29f6cc4d7c43466ba10cbdb65795ecd00aa1ff2"} Oct 01 13:40:49 crc kubenswrapper[4774]: I1001 13:40:49.521315 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2p9p" event={"ID":"0930b18d-4424-4085-917f-114bb3efe343","Type":"ContainerStarted","Data":"36663b761e0aa4d76e41ac26f45b632cf91e8e3622932e14e7ae21b0ec113b1e"} Oct 01 13:40:49 crc kubenswrapper[4774]: I1001 13:40:49.534506 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jmh9n" podStartSLOduration=2.560074665 podStartE2EDuration="56.534491873s" podCreationTimestamp="2025-10-01 13:39:53 +0000 UTC" firstStartedPulling="2025-10-01 13:39:55.018295088 +0000 UTC m=+166.907925685" lastFinishedPulling="2025-10-01 13:40:48.992712296 +0000 UTC m=+220.882342893" observedRunningTime="2025-10-01 13:40:49.53368828 +0000 UTC m=+221.423318877" watchObservedRunningTime="2025-10-01 13:40:49.534491873 +0000 UTC m=+221.424122470" Oct 01 13:40:49 crc kubenswrapper[4774]: I1001 13:40:49.558881 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k2p9p" podStartSLOduration=3.6099448450000002 
podStartE2EDuration="57.558839339s" podCreationTimestamp="2025-10-01 13:39:52 +0000 UTC" firstStartedPulling="2025-10-01 13:39:55.003117334 +0000 UTC m=+166.892747931" lastFinishedPulling="2025-10-01 13:40:48.952011828 +0000 UTC m=+220.841642425" observedRunningTime="2025-10-01 13:40:49.557695157 +0000 UTC m=+221.447325754" watchObservedRunningTime="2025-10-01 13:40:49.558839339 +0000 UTC m=+221.448469956" Oct 01 13:40:50 crc kubenswrapper[4774]: I1001 13:40:50.080233 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:40:50 crc kubenswrapper[4774]: I1001 13:40:50.124333 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rjxls" Oct 01 13:40:51 crc kubenswrapper[4774]: I1001 13:40:51.849039 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-chw44"] Oct 01 13:40:53 crc kubenswrapper[4774]: I1001 13:40:53.315834 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:40:53 crc kubenswrapper[4774]: I1001 13:40:53.316201 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:40:53 crc kubenswrapper[4774]: I1001 13:40:53.693294 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:40:53 crc kubenswrapper[4774]: I1001 13:40:53.693348 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:40:53 crc kubenswrapper[4774]: I1001 13:40:53.732933 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:40:54 crc kubenswrapper[4774]: I1001 13:40:54.356252 4774 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2p9p" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="registry-server" probeResult="failure" output=< Oct 01 13:40:54 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 01 13:40:54 crc kubenswrapper[4774]: > Oct 01 13:40:54 crc kubenswrapper[4774]: I1001 13:40:54.595246 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:40:55 crc kubenswrapper[4774]: I1001 13:40:55.521481 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jmh9n"] Oct 01 13:40:56 crc kubenswrapper[4774]: I1001 13:40:56.551592 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jmh9n" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="registry-server" containerID="cri-o://28dc34d81da3cde75e0da6e3b29f6cc4d7c43466ba10cbdb65795ecd00aa1ff2" gracePeriod=2 Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.557349 4774 generic.go:334] "Generic (PLEG): container finished" podID="fc70b8cc-0865-43ad-b407-0205067634dc" containerID="28dc34d81da3cde75e0da6e3b29f6cc4d7c43466ba10cbdb65795ecd00aa1ff2" exitCode=0 Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.557437 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmh9n" event={"ID":"fc70b8cc-0865-43ad-b407-0205067634dc","Type":"ContainerDied","Data":"28dc34d81da3cde75e0da6e3b29f6cc4d7c43466ba10cbdb65795ecd00aa1ff2"} Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.841250 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.939977 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-utilities\") pod \"fc70b8cc-0865-43ad-b407-0205067634dc\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.940110 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-catalog-content\") pod \"fc70b8cc-0865-43ad-b407-0205067634dc\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.940879 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-utilities" (OuterVolumeSpecName: "utilities") pod "fc70b8cc-0865-43ad-b407-0205067634dc" (UID: "fc70b8cc-0865-43ad-b407-0205067634dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.954306 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpgsr\" (UniqueName: \"kubernetes.io/projected/fc70b8cc-0865-43ad-b407-0205067634dc-kube-api-access-zpgsr\") pod \"fc70b8cc-0865-43ad-b407-0205067634dc\" (UID: \"fc70b8cc-0865-43ad-b407-0205067634dc\") " Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.957431 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:57 crc kubenswrapper[4774]: I1001 13:40:57.960167 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc70b8cc-0865-43ad-b407-0205067634dc-kube-api-access-zpgsr" (OuterVolumeSpecName: "kube-api-access-zpgsr") pod "fc70b8cc-0865-43ad-b407-0205067634dc" (UID: "fc70b8cc-0865-43ad-b407-0205067634dc"). InnerVolumeSpecName "kube-api-access-zpgsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.037093 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc70b8cc-0865-43ad-b407-0205067634dc" (UID: "fc70b8cc-0865-43ad-b407-0205067634dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.059005 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc70b8cc-0865-43ad-b407-0205067634dc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.059046 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpgsr\" (UniqueName: \"kubernetes.io/projected/fc70b8cc-0865-43ad-b407-0205067634dc-kube-api-access-zpgsr\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.566259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmh9n" event={"ID":"fc70b8cc-0865-43ad-b407-0205067634dc","Type":"ContainerDied","Data":"dd7ddf25a7cd26e66ffdda33be3d0aa9764e68efe10b7520d80b277a6a80ce5d"} Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.567296 4774 scope.go:117] "RemoveContainer" containerID="28dc34d81da3cde75e0da6e3b29f6cc4d7c43466ba10cbdb65795ecd00aa1ff2" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.566338 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jmh9n" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.587399 4774 scope.go:117] "RemoveContainer" containerID="77f2e6e9574a4391d9978d23eee14eb3b7f692c044d940b9e3474b06d55680a0" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.595137 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jmh9n"] Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.598774 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jmh9n"] Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.602192 4774 scope.go:117] "RemoveContainer" containerID="6df3d893038350f0779aa7726ab40293e257b18369524589f9cf7363e936199d" Oct 01 13:40:58 crc kubenswrapper[4774]: I1001 13:40:58.880423 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" path="/var/lib/kubelet/pods/fc70b8cc-0865-43ad-b407-0205067634dc/volumes" Oct 01 13:41:03 crc kubenswrapper[4774]: I1001 13:41:03.373930 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:41:03 crc kubenswrapper[4774]: I1001 13:41:03.419309 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k2p9p" Oct 01 13:41:16 crc kubenswrapper[4774]: I1001 13:41:16.878044 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" podUID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" containerName="oauth-openshift" containerID="cri-o://90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046" gracePeriod=15 Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.344965 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410496 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-96d6999f9-4m2cx"] Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410757 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410771 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410790 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410798 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410809 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410816 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410827 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410835 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410847 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49d829f3-7369-4ca1-a2b4-d4463250496f" containerName="pruner" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410855 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d829f3-7369-4ca1-a2b4-d4463250496f" containerName="pruner" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410866 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410873 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410885 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410892 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410900 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410910 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410923 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410930 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410941 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="43108b2a-ea14-4eff-a9ed-e9600f1d9ae2" containerName="pruner" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410949 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="43108b2a-ea14-4eff-a9ed-e9600f1d9ae2" containerName="pruner" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410958 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" containerName="oauth-openshift" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410967 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" containerName="oauth-openshift" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410979 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.410988 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.410997 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411007 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="extract-utilities" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.411018 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411025 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.411037 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411045 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="extract-content" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411146 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f32cd0-a153-4829-9671-bcf21227cc13" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411158 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="43108b2a-ea14-4eff-a9ed-e9600f1d9ae2" containerName="pruner" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411169 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc70b8cc-0865-43ad-b407-0205067634dc" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411180 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d829f3-7369-4ca1-a2b4-d4463250496f" containerName="pruner" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411189 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" containerName="oauth-openshift" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411198 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e9ea5d-233d-4792-ab12-da809240a05c" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411208 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5248d9d-f880-4802-ab66-c9ecf256c2b8" containerName="registry-server" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.411682 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.418563 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-96d6999f9-4m2cx"] Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.419562 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-session\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.419606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-service-ca\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.419641 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-ocp-branding-template\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.419681 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-dir\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.419959 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-cliconfig\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420004 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-audit-policies\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420136 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420178 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420214 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: 
\"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420248 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-login\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420247 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420518 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-service-ca\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420739 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-router-certs\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.420856 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-serving-cert\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421091 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85495b3d-e2ee-4d55-9567-0104624c6930-audit-dir\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421119 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421333 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txjjz\" (UniqueName: \"kubernetes.io/projected/85495b3d-e2ee-4d55-9567-0104624c6930-kube-api-access-txjjz\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421545 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-error\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421722 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421815 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-session\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.421956 4774 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.422063 4774 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.434632 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.435109 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.522519 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-serving-cert\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.522607 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-trusted-ca-bundle\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.522674 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-idp-0-file-data\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.522766 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-cliconfig\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.522866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-error\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 
13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.522976 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmrxw\" (UniqueName: \"kubernetes.io/projected/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-kube-api-access-fmrxw\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523039 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-login\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523095 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-policies\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523155 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-router-certs\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523287 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-provider-selection\") pod \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\" (UID: \"f03e45d8-27cd-4b18-87cf-e5abce7af1e2\") " Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523683 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-cliconfig\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523687 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523768 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523758 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-audit-policies\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523916 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.523974 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524058 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-login\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524120 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-service-ca\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524204 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-router-certs\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524246 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-serving-cert\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524552 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85495b3d-e2ee-4d55-9567-0104624c6930-audit-dir\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " 
pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524598 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txjjz\" (UniqueName: \"kubernetes.io/projected/85495b3d-e2ee-4d55-9567-0104624c6930-kube-api-access-txjjz\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524637 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-error\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524780 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-session\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.524794 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.525101 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-cliconfig\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.525327 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-audit-policies\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.526398 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.526605 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-service-ca\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " 
pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.528975 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.529063 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530153 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-login\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530294 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530414 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/85495b3d-e2ee-4d55-9567-0104624c6930-audit-dir\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530537 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530739 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530795 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530828 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.530931 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.531139 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.531217 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.531756 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.531988 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.532151 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-serving-cert\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.532895 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-kube-api-access-fmrxw" (OuterVolumeSpecName: "kube-api-access-fmrxw") pod "f03e45d8-27cd-4b18-87cf-e5abce7af1e2" (UID: "f03e45d8-27cd-4b18-87cf-e5abce7af1e2"). InnerVolumeSpecName "kube-api-access-fmrxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.533356 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.534254 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-session\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.535063 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-user-template-error\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.536710 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/85495b3d-e2ee-4d55-9567-0104624c6930-v4-0-config-system-router-certs\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.561038 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txjjz\" (UniqueName: \"kubernetes.io/projected/85495b3d-e2ee-4d55-9567-0104624c6930-kube-api-access-txjjz\") pod \"oauth-openshift-96d6999f9-4m2cx\" (UID: \"85495b3d-e2ee-4d55-9567-0104624c6930\") " pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632025 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632075 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632095 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632113 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632132 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmrxw\" (UniqueName: \"kubernetes.io/projected/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-kube-api-access-fmrxw\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632152 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632170 4774 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.632190 4774 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f03e45d8-27cd-4b18-87cf-e5abce7af1e2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.695425 4774 generic.go:334] "Generic (PLEG): container finished" podID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" containerID="90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046" exitCode=0 Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.695615 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-chw44" event={"ID":"f03e45d8-27cd-4b18-87cf-e5abce7af1e2","Type":"ContainerDied","Data":"90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046"} Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.695695 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" event={"ID":"f03e45d8-27cd-4b18-87cf-e5abce7af1e2","Type":"ContainerDied","Data":"a22c9dca26401e277d2f21a135c3aafff30f0e0836550d4aca1f2f373a9ad8ff"} Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.695869 4774 scope.go:117] "RemoveContainer" containerID="90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.696191 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-chw44" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.728839 4774 scope.go:117] "RemoveContainer" containerID="90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046" Oct 01 13:41:17 crc kubenswrapper[4774]: E1001 13:41:17.729341 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046\": container with ID starting with 90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046 not found: ID does not exist" containerID="90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.729427 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046"} err="failed to get container status \"90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046\": rpc error: code = NotFound desc = could not find container 
\"90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046\": container with ID starting with 90a723816629a2715d429eee5fa94ed1df36531569612f69d42565cab1983046 not found: ID does not exist" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.753812 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.765565 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-chw44"] Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.769182 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-chw44"] Oct 01 13:41:17 crc kubenswrapper[4774]: I1001 13:41:17.956002 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-96d6999f9-4m2cx"] Oct 01 13:41:18 crc kubenswrapper[4774]: I1001 13:41:18.706073 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" event={"ID":"85495b3d-e2ee-4d55-9567-0104624c6930","Type":"ContainerStarted","Data":"82b239a2619cf6454b6cca3349efd8587f8933dd364bf999f33812cca5ff4d72"} Oct 01 13:41:18 crc kubenswrapper[4774]: I1001 13:41:18.706565 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:18 crc kubenswrapper[4774]: I1001 13:41:18.706588 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" event={"ID":"85495b3d-e2ee-4d55-9567-0104624c6930","Type":"ContainerStarted","Data":"c9efb325efafbb5f26f2b87c332098786313bd5e6575ded243b1d860b5cceafc"} Oct 01 13:41:18 crc kubenswrapper[4774]: I1001 13:41:18.717408 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" Oct 01 13:41:18 crc kubenswrapper[4774]: I1001 13:41:18.740655 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-96d6999f9-4m2cx" podStartSLOduration=27.740620574 podStartE2EDuration="27.740620574s" podCreationTimestamp="2025-10-01 13:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:41:18.739007679 +0000 UTC m=+250.628638366" watchObservedRunningTime="2025-10-01 13:41:18.740620574 +0000 UTC m=+250.630251211" Oct 01 13:41:18 crc kubenswrapper[4774]: I1001 13:41:18.877164 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f03e45d8-27cd-4b18-87cf-e5abce7af1e2" path="/var/lib/kubelet/pods/f03e45d8-27cd-4b18-87cf-e5abce7af1e2/volumes" Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.665480 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjxls"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.666749 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rjxls" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="registry-server" containerID="cri-o://9b25f05ddc617a644d6b30aa332532ac2228225fc6b97d821fadc648b4045a21" gracePeriod=30 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.676654 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7fkzd"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.677967 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7fkzd" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="registry-server" containerID="cri-o://58523b0d8539685f41710699ed361f42610a560bef5f23c7790e2fdf340adbd9" gracePeriod=30 Oct 01 
13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.694128 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxbs7"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.694383 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerName="marketplace-operator" containerID="cri-o://99790d2faf60f93a9d2fcd7bd8c63a8c2bce875a95f85b7917c44a09c46a0bab" gracePeriod=30 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.703280 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgjkf"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.703643 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rgjkf" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="registry-server" containerID="cri-o://27f7d9a09745e4820f9afaa995d2cfccd7f51adbc22c616cd08d30d4d0438ba3" gracePeriod=30 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.712474 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2p9p"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.713019 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k2p9p" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="registry-server" containerID="cri-o://36663b761e0aa4d76e41ac26f45b632cf91e8e3622932e14e7ae21b0ec113b1e" gracePeriod=30 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.716306 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zndj2"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.717679 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.719850 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zndj2"] Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.834479 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3115994a-f7c2-410d-957b-0b08edee5125-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.834908 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3115994a-f7c2-410d-957b-0b08edee5125-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.834955 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7fl5\" (UniqueName: \"kubernetes.io/projected/3115994a-f7c2-410d-957b-0b08edee5125-kube-api-access-h7fl5\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.896752 4774 generic.go:334] "Generic (PLEG): container finished" podID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerID="9b25f05ddc617a644d6b30aa332532ac2228225fc6b97d821fadc648b4045a21" exitCode=0 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.896809 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-rjxls" event={"ID":"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8","Type":"ContainerDied","Data":"9b25f05ddc617a644d6b30aa332532ac2228225fc6b97d821fadc648b4045a21"} Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.899119 4774 generic.go:334] "Generic (PLEG): container finished" podID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerID="27f7d9a09745e4820f9afaa995d2cfccd7f51adbc22c616cd08d30d4d0438ba3" exitCode=0 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.899162 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgjkf" event={"ID":"9e764fef-cdd8-40bc-88cb-cad642239efc","Type":"ContainerDied","Data":"27f7d9a09745e4820f9afaa995d2cfccd7f51adbc22c616cd08d30d4d0438ba3"} Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.900673 4774 generic.go:334] "Generic (PLEG): container finished" podID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerID="99790d2faf60f93a9d2fcd7bd8c63a8c2bce875a95f85b7917c44a09c46a0bab" exitCode=0 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.900721 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" event={"ID":"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6","Type":"ContainerDied","Data":"99790d2faf60f93a9d2fcd7bd8c63a8c2bce875a95f85b7917c44a09c46a0bab"} Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.902288 4774 generic.go:334] "Generic (PLEG): container finished" podID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerID="58523b0d8539685f41710699ed361f42610a560bef5f23c7790e2fdf340adbd9" exitCode=0 Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.902323 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fkzd" event={"ID":"a572962b-dbdb-4313-ada5-20c8eddfed11","Type":"ContainerDied","Data":"58523b0d8539685f41710699ed361f42610a560bef5f23c7790e2fdf340adbd9"} Oct 01 13:41:47 crc kubenswrapper[4774]: 
I1001 13:41:47.904225 4774 generic.go:334] "Generic (PLEG): container finished" podID="0930b18d-4424-4085-917f-114bb3efe343" containerID="36663b761e0aa4d76e41ac26f45b632cf91e8e3622932e14e7ae21b0ec113b1e" exitCode=0
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.904247 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2p9p" event={"ID":"0930b18d-4424-4085-917f-114bb3efe343","Type":"ContainerDied","Data":"36663b761e0aa4d76e41ac26f45b632cf91e8e3622932e14e7ae21b0ec113b1e"}
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.936202 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3115994a-f7c2-410d-957b-0b08edee5125-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.936262 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3115994a-f7c2-410d-957b-0b08edee5125-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.936308 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7fl5\" (UniqueName: \"kubernetes.io/projected/3115994a-f7c2-410d-957b-0b08edee5125-kube-api-access-h7fl5\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.938602 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3115994a-f7c2-410d-957b-0b08edee5125-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.944813 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3115994a-f7c2-410d-957b-0b08edee5125-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:47 crc kubenswrapper[4774]: I1001 13:41:47.952166 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7fl5\" (UniqueName: \"kubernetes.io/projected/3115994a-f7c2-410d-957b-0b08edee5125-kube-api-access-h7fl5\") pod \"marketplace-operator-79b997595-zndj2\" (UID: \"3115994a-f7c2-410d-957b-0b08edee5125\") " pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.034173 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.118316 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxls"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.132362 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2p9p"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.169573 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.169972 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fkzd"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.178100 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgjkf"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.242894 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-catalog-content\") pod \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.242985 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlsdx\" (UniqueName: \"kubernetes.io/projected/0930b18d-4424-4085-917f-114bb3efe343-kube-api-access-mlsdx\") pod \"0930b18d-4424-4085-917f-114bb3efe343\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.243018 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-utilities\") pod \"0930b18d-4424-4085-917f-114bb3efe343\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.243076 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-catalog-content\") pod \"0930b18d-4424-4085-917f-114bb3efe343\" (UID: \"0930b18d-4424-4085-917f-114bb3efe343\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.243135 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4rm\" (UniqueName: \"kubernetes.io/projected/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-kube-api-access-2d4rm\") pod \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.243258 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-utilities\") pod \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\" (UID: \"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.244121 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-utilities" (OuterVolumeSpecName: "utilities") pod "d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" (UID: "d47e8bfd-f0b4-494b-9125-7b0dcf336ff8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.244418 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.247105 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-utilities" (OuterVolumeSpecName: "utilities") pod "0930b18d-4424-4085-917f-114bb3efe343" (UID: "0930b18d-4424-4085-917f-114bb3efe343"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.254697 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-kube-api-access-2d4rm" (OuterVolumeSpecName: "kube-api-access-2d4rm") pod "d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" (UID: "d47e8bfd-f0b4-494b-9125-7b0dcf336ff8"). InnerVolumeSpecName "kube-api-access-2d4rm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.267169 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0930b18d-4424-4085-917f-114bb3efe343-kube-api-access-mlsdx" (OuterVolumeSpecName: "kube-api-access-mlsdx") pod "0930b18d-4424-4085-917f-114bb3efe343" (UID: "0930b18d-4424-4085-917f-114bb3efe343"). InnerVolumeSpecName "kube-api-access-mlsdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.310509 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" (UID: "d47e8bfd-f0b4-494b-9125-7b0dcf336ff8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.326685 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0930b18d-4424-4085-917f-114bb3efe343" (UID: "0930b18d-4424-4085-917f-114bb3efe343"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.346936 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-trusted-ca\") pod \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347012 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hljfn\" (UniqueName: \"kubernetes.io/projected/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-kube-api-access-hljfn\") pod \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347051 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg6n9\" (UniqueName: \"kubernetes.io/projected/9e764fef-cdd8-40bc-88cb-cad642239efc-kube-api-access-xg6n9\") pod \"9e764fef-cdd8-40bc-88cb-cad642239efc\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347099 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-catalog-content\") pod \"a572962b-dbdb-4313-ada5-20c8eddfed11\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347126 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-operator-metrics\") pod \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\" (UID: \"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347161 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ddcf\" (UniqueName: \"kubernetes.io/projected/a572962b-dbdb-4313-ada5-20c8eddfed11-kube-api-access-5ddcf\") pod \"a572962b-dbdb-4313-ada5-20c8eddfed11\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347203 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-utilities\") pod \"a572962b-dbdb-4313-ada5-20c8eddfed11\" (UID: \"a572962b-dbdb-4313-ada5-20c8eddfed11\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347270 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-catalog-content\") pod \"9e764fef-cdd8-40bc-88cb-cad642239efc\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347299 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-utilities\") pod \"9e764fef-cdd8-40bc-88cb-cad642239efc\" (UID: \"9e764fef-cdd8-40bc-88cb-cad642239efc\") "
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347559 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347583 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlsdx\" (UniqueName: \"kubernetes.io/projected/0930b18d-4424-4085-917f-114bb3efe343-kube-api-access-mlsdx\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347599 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347612 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930b18d-4424-4085-917f-114bb3efe343-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.347624 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4rm\" (UniqueName: \"kubernetes.io/projected/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8-kube-api-access-2d4rm\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.348095 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" (UID: "72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.348902 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-utilities" (OuterVolumeSpecName: "utilities") pod "a572962b-dbdb-4313-ada5-20c8eddfed11" (UID: "a572962b-dbdb-4313-ada5-20c8eddfed11"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.349162 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-utilities" (OuterVolumeSpecName: "utilities") pod "9e764fef-cdd8-40bc-88cb-cad642239efc" (UID: "9e764fef-cdd8-40bc-88cb-cad642239efc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.351019 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" (UID: "72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.352044 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a572962b-dbdb-4313-ada5-20c8eddfed11-kube-api-access-5ddcf" (OuterVolumeSpecName: "kube-api-access-5ddcf") pod "a572962b-dbdb-4313-ada5-20c8eddfed11" (UID: "a572962b-dbdb-4313-ada5-20c8eddfed11"). InnerVolumeSpecName "kube-api-access-5ddcf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.352491 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-kube-api-access-hljfn" (OuterVolumeSpecName: "kube-api-access-hljfn") pod "72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" (UID: "72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6"). InnerVolumeSpecName "kube-api-access-hljfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.352966 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e764fef-cdd8-40bc-88cb-cad642239efc-kube-api-access-xg6n9" (OuterVolumeSpecName: "kube-api-access-xg6n9") pod "9e764fef-cdd8-40bc-88cb-cad642239efc" (UID: "9e764fef-cdd8-40bc-88cb-cad642239efc"). InnerVolumeSpecName "kube-api-access-xg6n9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.368234 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e764fef-cdd8-40bc-88cb-cad642239efc" (UID: "9e764fef-cdd8-40bc-88cb-cad642239efc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.417328 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a572962b-dbdb-4313-ada5-20c8eddfed11" (UID: "a572962b-dbdb-4313-ada5-20c8eddfed11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449110 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ddcf\" (UniqueName: \"kubernetes.io/projected/a572962b-dbdb-4313-ada5-20c8eddfed11-kube-api-access-5ddcf\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449161 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449217 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449232 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e764fef-cdd8-40bc-88cb-cad642239efc-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449281 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449299 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hljfn\" (UniqueName: \"kubernetes.io/projected/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-kube-api-access-hljfn\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449313 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg6n9\" (UniqueName: \"kubernetes.io/projected/9e764fef-cdd8-40bc-88cb-cad642239efc-kube-api-access-xg6n9\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449328 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a572962b-dbdb-4313-ada5-20c8eddfed11-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.449396 4774 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.508092 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zndj2"]
Oct 01 13:41:48 crc kubenswrapper[4774]: W1001 13:41:48.520405 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3115994a_f7c2_410d_957b_0b08edee5125.slice/crio-924d92b05249c2419c5d1b2b7adf496c8dbcea101dd30f020eacee1cb8eff092 WatchSource:0}: Error finding container 924d92b05249c2419c5d1b2b7adf496c8dbcea101dd30f020eacee1cb8eff092: Status 404 returned error can't find the container with id 924d92b05249c2419c5d1b2b7adf496c8dbcea101dd30f020eacee1cb8eff092
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.916152 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgjkf" event={"ID":"9e764fef-cdd8-40bc-88cb-cad642239efc","Type":"ContainerDied","Data":"99016bc816a33089a44bd10ee21a08ab1438eb6a16d8e06f3f1acbf578d9b1b6"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.916227 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgjkf"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.916516 4774 scope.go:117] "RemoveContainer" containerID="27f7d9a09745e4820f9afaa995d2cfccd7f51adbc22c616cd08d30d4d0438ba3"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.918244 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.918425 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kxbs7" event={"ID":"72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6","Type":"ContainerDied","Data":"497971d059a5296cdc5cd92977b858e66ecef674aa22a20fa72183bbaefa2938"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.922190 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fkzd" event={"ID":"a572962b-dbdb-4313-ada5-20c8eddfed11","Type":"ContainerDied","Data":"c53780b4c63b004bd134cdd1db99595dfe6a19e742700a8d1987c20b442e86b1"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.922295 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fkzd"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.925336 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" event={"ID":"3115994a-f7c2-410d-957b-0b08edee5125","Type":"ContainerStarted","Data":"204d111fd9da283e30cc859c9aaea9d7612b248e5d37260ed9e88b233e958686"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.925410 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" event={"ID":"3115994a-f7c2-410d-957b-0b08edee5125","Type":"ContainerStarted","Data":"924d92b05249c2419c5d1b2b7adf496c8dbcea101dd30f020eacee1cb8eff092"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.925574 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.930006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2p9p" event={"ID":"0930b18d-4424-4085-917f-114bb3efe343","Type":"ContainerDied","Data":"e70b26636cbcd58b0784a1125d09382b62ed7bfd7cdac31258d3bcf0cf2cd7b6"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.930023 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2p9p"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.931130 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.934267 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxls" event={"ID":"d47e8bfd-f0b4-494b-9125-7b0dcf336ff8","Type":"ContainerDied","Data":"d8a51a8ff1c5ed214d3e9a65a1e55b737b3aed5292cdba59b5fae6d7eb0f7834"}
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.934389 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxls"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.948890 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zndj2" podStartSLOduration=1.948867608 podStartE2EDuration="1.948867608s" podCreationTimestamp="2025-10-01 13:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:41:48.94584825 +0000 UTC m=+280.835478868" watchObservedRunningTime="2025-10-01 13:41:48.948867608 +0000 UTC m=+280.838498215"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.954074 4774 scope.go:117] "RemoveContainer" containerID="0daa6951dc8600bdf129ea7c0059ec137b09468ef092d65eb8e5e07d118ae776"
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.963523 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7fkzd"]
Oct 01 13:41:48 crc kubenswrapper[4774]: I1001 13:41:48.966871 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7fkzd"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.008711 4774 scope.go:117] "RemoveContainer" containerID="f3897e7617c265bc7a59365b62498159860ced5621a9d6d71ea444549ff5b9dd"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.026254 4774 scope.go:117] "RemoveContainer" containerID="99790d2faf60f93a9d2fcd7bd8c63a8c2bce875a95f85b7917c44a09c46a0bab"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.029159 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxbs7"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.037303 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kxbs7"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.042874 4774 scope.go:117] "RemoveContainer" containerID="58523b0d8539685f41710699ed361f42610a560bef5f23c7790e2fdf340adbd9"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.044370 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2p9p"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.050917 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k2p9p"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.054694 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjxls"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.061554 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rjxls"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.064963 4774 scope.go:117] "RemoveContainer" containerID="979472644c5aa28538b5bbd0014643f8c7e5f50933da1458307c967409bfb22a"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.068910 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgjkf"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.072054 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgjkf"]
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.098641 4774 scope.go:117] "RemoveContainer" containerID="95e384c14f902e6a12246a2e6543e0a0760e91b4fa12c800970685f65ba6c907"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.109328 4774 scope.go:117] "RemoveContainer" containerID="36663b761e0aa4d76e41ac26f45b632cf91e8e3622932e14e7ae21b0ec113b1e"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.126602 4774 scope.go:117] "RemoveContainer" containerID="fba3b6f0437556dfa716964ef321b064e983f89f888247bdc2b83c5fa20fc487"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.140123 4774 scope.go:117] "RemoveContainer" containerID="3fe85745d27ca09ea26a876f54e2af4748773b5df9a07e1ac0c5c70967208343"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.152680 4774 scope.go:117] "RemoveContainer" containerID="9b25f05ddc617a644d6b30aa332532ac2228225fc6b97d821fadc648b4045a21"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.165585 4774 scope.go:117] "RemoveContainer" containerID="37af91829c195307452de213df4229b6410c6c1350935ebf65b03f043402eb59"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.178159 4774 scope.go:117] "RemoveContainer" containerID="6465d5511a55940f8351014baa8c6a7ed72cb594b375d7ab753e5807886314bf"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876289 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c9bgd"]
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876515 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876528 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876538 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerName="marketplace-operator"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876544 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerName="marketplace-operator"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876551 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876557 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876563 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876568 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876578 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876585 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876593 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876600 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876607 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876614 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876621 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876626 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876633 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876639 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876647 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876653 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876661 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876667 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876675 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876680 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="extract-content"
Oct 01 13:41:49 crc kubenswrapper[4774]: E1001 13:41:49.876689 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876695 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="extract-utilities"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876771 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" containerName="marketplace-operator"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876780 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876786 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0930b18d-4424-4085-917f-114bb3efe343" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876796 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.876804 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" containerName="registry-server"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.877420 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.880518 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 01 13:41:49 crc kubenswrapper[4774]: I1001 13:41:49.888209 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9bgd"]
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.069920 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd81359-4e78-4195-958e-f6a4e859cf2c-utilities\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.070020 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd81359-4e78-4195-958e-f6a4e859cf2c-catalog-content\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.070055 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9jm\" (UniqueName: \"kubernetes.io/projected/1fd81359-4e78-4195-958e-f6a4e859cf2c-kube-api-access-4b9jm\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.076365 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pm5fh"]
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.079857 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm5fh"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.083133 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm5fh"]
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.083397 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171352 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd81359-4e78-4195-958e-f6a4e859cf2c-utilities\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171665 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd81359-4e78-4195-958e-f6a4e859cf2c-catalog-content\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171769 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9jm\" (UniqueName: \"kubernetes.io/projected/1fd81359-4e78-4195-958e-f6a4e859cf2c-kube-api-access-4b9jm\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171857 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndnk\" (UniqueName: \"kubernetes.io/projected/374500ba-b989-44b3-bae2-1df03f16da01-kube-api-access-pndnk\") pod \"certified-operators-pm5fh\" (UID: 
\"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171785 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd81359-4e78-4195-958e-f6a4e859cf2c-utilities\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171982 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd81359-4e78-4195-958e-f6a4e859cf2c-catalog-content\") pod \"redhat-marketplace-c9bgd\" (UID: \"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.171995 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374500ba-b989-44b3-bae2-1df03f16da01-catalog-content\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.172141 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374500ba-b989-44b3-bae2-1df03f16da01-utilities\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.193155 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9jm\" (UniqueName: \"kubernetes.io/projected/1fd81359-4e78-4195-958e-f6a4e859cf2c-kube-api-access-4b9jm\") pod \"redhat-marketplace-c9bgd\" (UID: 
\"1fd81359-4e78-4195-958e-f6a4e859cf2c\") " pod="openshift-marketplace/redhat-marketplace-c9bgd" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.205704 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c9bgd" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.276737 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndnk\" (UniqueName: \"kubernetes.io/projected/374500ba-b989-44b3-bae2-1df03f16da01-kube-api-access-pndnk\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.277142 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374500ba-b989-44b3-bae2-1df03f16da01-catalog-content\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.277874 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/374500ba-b989-44b3-bae2-1df03f16da01-catalog-content\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.281399 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374500ba-b989-44b3-bae2-1df03f16da01-utilities\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.281836 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/374500ba-b989-44b3-bae2-1df03f16da01-utilities\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.295578 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndnk\" (UniqueName: \"kubernetes.io/projected/374500ba-b989-44b3-bae2-1df03f16da01-kube-api-access-pndnk\") pod \"certified-operators-pm5fh\" (UID: \"374500ba-b989-44b3-bae2-1df03f16da01\") " pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.403310 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pm5fh" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.420356 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c9bgd"] Oct 01 13:41:50 crc kubenswrapper[4774]: W1001 13:41:50.425489 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd81359_4e78_4195_958e_f6a4e859cf2c.slice/crio-4820216e6329d5e1ed59fdedd8e8d62b9ae2ae3831abdd57d816e3e08b2de9b8 WatchSource:0}: Error finding container 4820216e6329d5e1ed59fdedd8e8d62b9ae2ae3831abdd57d816e3e08b2de9b8: Status 404 returned error can't find the container with id 4820216e6329d5e1ed59fdedd8e8d62b9ae2ae3831abdd57d816e3e08b2de9b8 Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.551932 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pm5fh"] Oct 01 13:41:50 crc kubenswrapper[4774]: W1001 13:41:50.559763 4774 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374500ba_b989_44b3_bae2_1df03f16da01.slice/crio-149b7ca764213eb7afd67470e5a97dd2e7546ff21fad6cc0aa33445c0399325a WatchSource:0}: Error finding container 149b7ca764213eb7afd67470e5a97dd2e7546ff21fad6cc0aa33445c0399325a: Status 404 returned error can't find the container with id 149b7ca764213eb7afd67470e5a97dd2e7546ff21fad6cc0aa33445c0399325a Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.877660 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0930b18d-4424-4085-917f-114bb3efe343" path="/var/lib/kubelet/pods/0930b18d-4424-4085-917f-114bb3efe343/volumes" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.878573 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6" path="/var/lib/kubelet/pods/72ac1ea7-63b4-46c2-b0d3-42bbb5cd23b6/volumes" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.878993 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e764fef-cdd8-40bc-88cb-cad642239efc" path="/var/lib/kubelet/pods/9e764fef-cdd8-40bc-88cb-cad642239efc/volumes" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.879541 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a572962b-dbdb-4313-ada5-20c8eddfed11" path="/var/lib/kubelet/pods/a572962b-dbdb-4313-ada5-20c8eddfed11/volumes" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.880091 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d47e8bfd-f0b4-494b-9125-7b0dcf336ff8" path="/var/lib/kubelet/pods/d47e8bfd-f0b4-494b-9125-7b0dcf336ff8/volumes" Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.950413 4774 generic.go:334] "Generic (PLEG): container finished" podID="1fd81359-4e78-4195-958e-f6a4e859cf2c" containerID="7cedda53707432d69b2a2591d9a140e4b0578b2c56a4eb9d27f7416ed6260b57" exitCode=0 Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.950548 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9bgd" event={"ID":"1fd81359-4e78-4195-958e-f6a4e859cf2c","Type":"ContainerDied","Data":"7cedda53707432d69b2a2591d9a140e4b0578b2c56a4eb9d27f7416ed6260b57"} Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.950578 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9bgd" event={"ID":"1fd81359-4e78-4195-958e-f6a4e859cf2c","Type":"ContainerStarted","Data":"4820216e6329d5e1ed59fdedd8e8d62b9ae2ae3831abdd57d816e3e08b2de9b8"} Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.952246 4774 generic.go:334] "Generic (PLEG): container finished" podID="374500ba-b989-44b3-bae2-1df03f16da01" containerID="66984ba30f187516e663105edda007cef587e1e6c7a53d504571a609d38cbc4a" exitCode=0 Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.952289 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5fh" event={"ID":"374500ba-b989-44b3-bae2-1df03f16da01","Type":"ContainerDied","Data":"66984ba30f187516e663105edda007cef587e1e6c7a53d504571a609d38cbc4a"} Oct 01 13:41:50 crc kubenswrapper[4774]: I1001 13:41:50.952309 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5fh" event={"ID":"374500ba-b989-44b3-bae2-1df03f16da01","Type":"ContainerStarted","Data":"149b7ca764213eb7afd67470e5a97dd2e7546ff21fad6cc0aa33445c0399325a"} Oct 01 13:41:51 crc kubenswrapper[4774]: I1001 13:41:51.959267 4774 generic.go:334] "Generic (PLEG): container finished" podID="374500ba-b989-44b3-bae2-1df03f16da01" containerID="68b96e48264c2eaadf53061f1fbc32bb15b72f5ea6d3862d956bdfcc8797ab82" exitCode=0 Oct 01 13:41:51 crc kubenswrapper[4774]: I1001 13:41:51.959375 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5fh" 
event={"ID":"374500ba-b989-44b3-bae2-1df03f16da01","Type":"ContainerDied","Data":"68b96e48264c2eaadf53061f1fbc32bb15b72f5ea6d3862d956bdfcc8797ab82"} Oct 01 13:41:51 crc kubenswrapper[4774]: I1001 13:41:51.961979 4774 generic.go:334] "Generic (PLEG): container finished" podID="1fd81359-4e78-4195-958e-f6a4e859cf2c" containerID="8010cf6b2e2994dcd89501234e9fd882bd57387016d44d58f4125fff9a2d2940" exitCode=0 Oct 01 13:41:51 crc kubenswrapper[4774]: I1001 13:41:51.962010 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9bgd" event={"ID":"1fd81359-4e78-4195-958e-f6a4e859cf2c","Type":"ContainerDied","Data":"8010cf6b2e2994dcd89501234e9fd882bd57387016d44d58f4125fff9a2d2940"} Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.289202 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cvpbg"] Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.291808 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvpbg"] Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.292131 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.296157 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.413073 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt8w6\" (UniqueName: \"kubernetes.io/projected/a1970618-5299-4a91-a1c7-d767f8ed21d9-kube-api-access-nt8w6\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.413301 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1970618-5299-4a91-a1c7-d767f8ed21d9-catalog-content\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.413440 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1970618-5299-4a91-a1c7-d767f8ed21d9-utilities\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.483837 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrzqd"] Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.485087 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.486680 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrzqd"] Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.488842 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.514145 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt8w6\" (UniqueName: \"kubernetes.io/projected/a1970618-5299-4a91-a1c7-d767f8ed21d9-kube-api-access-nt8w6\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.514197 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1970618-5299-4a91-a1c7-d767f8ed21d9-catalog-content\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.514243 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1970618-5299-4a91-a1c7-d767f8ed21d9-utilities\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.514664 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1970618-5299-4a91-a1c7-d767f8ed21d9-utilities\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 
13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.515192 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1970618-5299-4a91-a1c7-d767f8ed21d9-catalog-content\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.535322 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt8w6\" (UniqueName: \"kubernetes.io/projected/a1970618-5299-4a91-a1c7-d767f8ed21d9-kube-api-access-nt8w6\") pod \"redhat-operators-cvpbg\" (UID: \"a1970618-5299-4a91-a1c7-d767f8ed21d9\") " pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.615031 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7665b71-d4e4-4a6a-88dd-a17afc725e54-catalog-content\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.615087 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25lhw\" (UniqueName: \"kubernetes.io/projected/a7665b71-d4e4-4a6a-88dd-a17afc725e54-kube-api-access-25lhw\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.615218 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7665b71-d4e4-4a6a-88dd-a17afc725e54-utilities\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " 
pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.632140 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cvpbg" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.723208 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7665b71-d4e4-4a6a-88dd-a17afc725e54-catalog-content\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.723639 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25lhw\" (UniqueName: \"kubernetes.io/projected/a7665b71-d4e4-4a6a-88dd-a17afc725e54-kube-api-access-25lhw\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.723665 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7665b71-d4e4-4a6a-88dd-a17afc725e54-utilities\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.724348 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7665b71-d4e4-4a6a-88dd-a17afc725e54-catalog-content\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.724369 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a7665b71-d4e4-4a6a-88dd-a17afc725e54-utilities\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.749818 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25lhw\" (UniqueName: \"kubernetes.io/projected/a7665b71-d4e4-4a6a-88dd-a17afc725e54-kube-api-access-25lhw\") pod \"community-operators-wrzqd\" (UID: \"a7665b71-d4e4-4a6a-88dd-a17afc725e54\") " pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.842769 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cvpbg"] Oct 01 13:41:52 crc kubenswrapper[4774]: W1001 13:41:52.850657 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1970618_5299_4a91_a1c7_d767f8ed21d9.slice/crio-7c350d2e67999af096707a4a83c7c11f3936f1a1eb0427f9905750af10ed91ba WatchSource:0}: Error finding container 7c350d2e67999af096707a4a83c7c11f3936f1a1eb0427f9905750af10ed91ba: Status 404 returned error can't find the container with id 7c350d2e67999af096707a4a83c7c11f3936f1a1eb0427f9905750af10ed91ba Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.888391 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrzqd" Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.968790 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvpbg" event={"ID":"a1970618-5299-4a91-a1c7-d767f8ed21d9","Type":"ContainerStarted","Data":"7c350d2e67999af096707a4a83c7c11f3936f1a1eb0427f9905750af10ed91ba"} Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.985122 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c9bgd" event={"ID":"1fd81359-4e78-4195-958e-f6a4e859cf2c","Type":"ContainerStarted","Data":"13b84c377603da798984dde7ab178ee8314873ff91c5f5c6b88a5fa97f9dd13f"} Oct 01 13:41:52 crc kubenswrapper[4774]: I1001 13:41:52.994184 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pm5fh" event={"ID":"374500ba-b989-44b3-bae2-1df03f16da01","Type":"ContainerStarted","Data":"db8c825080c6682e64b178b8c50d897f3f39a40f9e8c0d9b8c62e604752300bd"} Oct 01 13:41:53 crc kubenswrapper[4774]: I1001 13:41:53.012068 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c9bgd" podStartSLOduration=2.472011873 podStartE2EDuration="4.01204836s" podCreationTimestamp="2025-10-01 13:41:49 +0000 UTC" firstStartedPulling="2025-10-01 13:41:50.951709615 +0000 UTC m=+282.841340222" lastFinishedPulling="2025-10-01 13:41:52.491746122 +0000 UTC m=+284.381376709" observedRunningTime="2025-10-01 13:41:53.007815127 +0000 UTC m=+284.897445784" watchObservedRunningTime="2025-10-01 13:41:53.01204836 +0000 UTC m=+284.901678957" Oct 01 13:41:53 crc kubenswrapper[4774]: I1001 13:41:53.024310 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pm5fh" podStartSLOduration=1.574542138 podStartE2EDuration="3.024298046s" podCreationTimestamp="2025-10-01 13:41:50 +0000 UTC" 
firstStartedPulling="2025-10-01 13:41:50.953204538 +0000 UTC m=+282.842835135" lastFinishedPulling="2025-10-01 13:41:52.402960446 +0000 UTC m=+284.292591043" observedRunningTime="2025-10-01 13:41:53.023439271 +0000 UTC m=+284.913069868" watchObservedRunningTime="2025-10-01 13:41:53.024298046 +0000 UTC m=+284.913928643" Oct 01 13:41:53 crc kubenswrapper[4774]: I1001 13:41:53.308574 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrzqd"] Oct 01 13:41:54 crc kubenswrapper[4774]: I1001 13:41:54.001844 4774 generic.go:334] "Generic (PLEG): container finished" podID="a7665b71-d4e4-4a6a-88dd-a17afc725e54" containerID="f676e92ba1b7bb230036cb658bacca41c6d9ec86959f3f92a3aa19ecc6acddac" exitCode=0 Oct 01 13:41:54 crc kubenswrapper[4774]: I1001 13:41:54.001900 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrzqd" event={"ID":"a7665b71-d4e4-4a6a-88dd-a17afc725e54","Type":"ContainerDied","Data":"f676e92ba1b7bb230036cb658bacca41c6d9ec86959f3f92a3aa19ecc6acddac"} Oct 01 13:41:54 crc kubenswrapper[4774]: I1001 13:41:54.002118 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrzqd" event={"ID":"a7665b71-d4e4-4a6a-88dd-a17afc725e54","Type":"ContainerStarted","Data":"cca3c80f55abbbbed7165a92e825d3ed9034524f0567568cce3c5a8a8a50160f"} Oct 01 13:41:54 crc kubenswrapper[4774]: I1001 13:41:54.003665 4774 generic.go:334] "Generic (PLEG): container finished" podID="a1970618-5299-4a91-a1c7-d767f8ed21d9" containerID="0bb473a26383f239e27bd5059d578cf8c92e6edc78785da8a8920e91c0b6b72b" exitCode=0 Oct 01 13:41:54 crc kubenswrapper[4774]: I1001 13:41:54.003764 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvpbg" event={"ID":"a1970618-5299-4a91-a1c7-d767f8ed21d9","Type":"ContainerDied","Data":"0bb473a26383f239e27bd5059d578cf8c92e6edc78785da8a8920e91c0b6b72b"} Oct 01 13:41:56 crc 
kubenswrapper[4774]: I1001 13:41:56.016667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvpbg" event={"ID":"a1970618-5299-4a91-a1c7-d767f8ed21d9","Type":"ContainerStarted","Data":"cad1e3ef124aa403eabbb4fce65739d652a049b3299b3b4465f91790cf30cfac"} Oct 01 13:41:56 crc kubenswrapper[4774]: I1001 13:41:56.019205 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrzqd" event={"ID":"a7665b71-d4e4-4a6a-88dd-a17afc725e54","Type":"ContainerStarted","Data":"84096d5fc486c027b272c6f8e2d9bcae8e8676f32a989d99a98278c0e944bbe1"} Oct 01 13:41:57 crc kubenswrapper[4774]: I1001 13:41:57.033416 4774 generic.go:334] "Generic (PLEG): container finished" podID="a7665b71-d4e4-4a6a-88dd-a17afc725e54" containerID="84096d5fc486c027b272c6f8e2d9bcae8e8676f32a989d99a98278c0e944bbe1" exitCode=0 Oct 01 13:41:57 crc kubenswrapper[4774]: I1001 13:41:57.033516 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrzqd" event={"ID":"a7665b71-d4e4-4a6a-88dd-a17afc725e54","Type":"ContainerDied","Data":"84096d5fc486c027b272c6f8e2d9bcae8e8676f32a989d99a98278c0e944bbe1"} Oct 01 13:41:57 crc kubenswrapper[4774]: I1001 13:41:57.036728 4774 generic.go:334] "Generic (PLEG): container finished" podID="a1970618-5299-4a91-a1c7-d767f8ed21d9" containerID="cad1e3ef124aa403eabbb4fce65739d652a049b3299b3b4465f91790cf30cfac" exitCode=0 Oct 01 13:41:57 crc kubenswrapper[4774]: I1001 13:41:57.036789 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvpbg" event={"ID":"a1970618-5299-4a91-a1c7-d767f8ed21d9","Type":"ContainerDied","Data":"cad1e3ef124aa403eabbb4fce65739d652a049b3299b3b4465f91790cf30cfac"} Oct 01 13:41:58 crc kubenswrapper[4774]: I1001 13:41:58.045208 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cvpbg" 
event={"ID":"a1970618-5299-4a91-a1c7-d767f8ed21d9","Type":"ContainerStarted","Data":"078f20e6e85409a0aa3fc182859895e28c807c8dc09d9499ae7f9bbf0489bc19"}
Oct 01 13:41:58 crc kubenswrapper[4774]: I1001 13:41:58.047690 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrzqd" event={"ID":"a7665b71-d4e4-4a6a-88dd-a17afc725e54","Type":"ContainerStarted","Data":"f33d07e944e0ac0ae89bfec1de04327b6e6ffd3a381cdf6ab7a354c2362c496a"}
Oct 01 13:41:58 crc kubenswrapper[4774]: I1001 13:41:58.064920 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cvpbg" podStartSLOduration=2.623718836 podStartE2EDuration="6.064900149s" podCreationTimestamp="2025-10-01 13:41:52 +0000 UTC" firstStartedPulling="2025-10-01 13:41:54.006550927 +0000 UTC m=+285.896181524" lastFinishedPulling="2025-10-01 13:41:57.44773224 +0000 UTC m=+289.337362837" observedRunningTime="2025-10-01 13:41:58.060638125 +0000 UTC m=+289.950268732" watchObservedRunningTime="2025-10-01 13:41:58.064900149 +0000 UTC m=+289.954530756"
Oct 01 13:41:58 crc kubenswrapper[4774]: I1001 13:41:58.086695 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrzqd" podStartSLOduration=2.240061452 podStartE2EDuration="6.08666872s" podCreationTimestamp="2025-10-01 13:41:52 +0000 UTC" firstStartedPulling="2025-10-01 13:41:54.004072405 +0000 UTC m=+285.893703002" lastFinishedPulling="2025-10-01 13:41:57.850679673 +0000 UTC m=+289.740310270" observedRunningTime="2025-10-01 13:41:58.086443654 +0000 UTC m=+289.976074271" watchObservedRunningTime="2025-10-01 13:41:58.08666872 +0000 UTC m=+289.976299357"
Oct 01 13:42:00 crc kubenswrapper[4774]: I1001 13:42:00.206432 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:42:00 crc kubenswrapper[4774]: I1001 13:42:00.206778 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:42:00 crc kubenswrapper[4774]: I1001 13:42:00.273875 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:42:00 crc kubenswrapper[4774]: I1001 13:42:00.404039 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pm5fh"
Oct 01 13:42:00 crc kubenswrapper[4774]: I1001 13:42:00.404343 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pm5fh"
Oct 01 13:42:00 crc kubenswrapper[4774]: I1001 13:42:00.456311 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pm5fh"
Oct 01 13:42:01 crc kubenswrapper[4774]: I1001 13:42:01.109988 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pm5fh"
Oct 01 13:42:01 crc kubenswrapper[4774]: I1001 13:42:01.115589 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c9bgd"
Oct 01 13:42:02 crc kubenswrapper[4774]: I1001 13:42:02.632500 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cvpbg"
Oct 01 13:42:02 crc kubenswrapper[4774]: I1001 13:42:02.632941 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cvpbg"
Oct 01 13:42:02 crc kubenswrapper[4774]: I1001 13:42:02.677147 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cvpbg"
Oct 01 13:42:02 crc kubenswrapper[4774]: I1001 13:42:02.888941 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wrzqd"
Oct 01 13:42:02 crc kubenswrapper[4774]: I1001 13:42:02.889342 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrzqd"
Oct 01 13:42:02 crc kubenswrapper[4774]: I1001 13:42:02.922957 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrzqd"
Oct 01 13:42:03 crc kubenswrapper[4774]: I1001 13:42:03.135205 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cvpbg"
Oct 01 13:42:03 crc kubenswrapper[4774]: I1001 13:42:03.147430 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrzqd"
Oct 01 13:42:37 crc kubenswrapper[4774]: I1001 13:42:37.271369 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:42:37 crc kubenswrapper[4774]: I1001 13:42:37.271998 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:43:07 crc kubenswrapper[4774]: I1001 13:43:07.271039 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:43:07 crc kubenswrapper[4774]: I1001 13:43:07.271650 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.271049 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.271916 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.271989 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd"
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.273097 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45fe67a4b4a9a83e2ae4bc8f080985498cc6ec39fabf36b93a55613990ba18db"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.273218 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://45fe67a4b4a9a83e2ae4bc8f080985498cc6ec39fabf36b93a55613990ba18db" gracePeriod=600
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.648330 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="45fe67a4b4a9a83e2ae4bc8f080985498cc6ec39fabf36b93a55613990ba18db" exitCode=0
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.648392 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"45fe67a4b4a9a83e2ae4bc8f080985498cc6ec39fabf36b93a55613990ba18db"}
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.648442 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"4f720152dbaf5a7960f0dcab2fc9653460d3dce4b218e09d42a4179e76eb1660"}
Oct 01 13:43:37 crc kubenswrapper[4774]: I1001 13:43:37.648498 4774 scope.go:117] "RemoveContainer" containerID="98fb3b8c7ff4e6f48d8f0c13430f3fbbb1f5752dfd769ba4472f953f13797c38"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.154612 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"]
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.158531 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.164358 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"]
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.167116 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.167178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.260337 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63f53406-b105-4266-97d6-9b886f9052f9-config-volume\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.260627 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63f53406-b105-4266-97d6-9b886f9052f9-secret-volume\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.260799 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q8kn\" (UniqueName: \"kubernetes.io/projected/63f53406-b105-4266-97d6-9b886f9052f9-kube-api-access-9q8kn\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.362732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q8kn\" (UniqueName: \"kubernetes.io/projected/63f53406-b105-4266-97d6-9b886f9052f9-kube-api-access-9q8kn\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.362850 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63f53406-b105-4266-97d6-9b886f9052f9-config-volume\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.362884 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63f53406-b105-4266-97d6-9b886f9052f9-secret-volume\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.364268 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63f53406-b105-4266-97d6-9b886f9052f9-config-volume\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.370638 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63f53406-b105-4266-97d6-9b886f9052f9-secret-volume\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.382731 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q8kn\" (UniqueName: \"kubernetes.io/projected/63f53406-b105-4266-97d6-9b886f9052f9-kube-api-access-9q8kn\") pod \"collect-profiles-29322105-mzd2j\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.493431 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:00 crc kubenswrapper[4774]: I1001 13:45:00.902188 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"]
Oct 01 13:45:01 crc kubenswrapper[4774]: I1001 13:45:01.243826 4774 generic.go:334] "Generic (PLEG): container finished" podID="63f53406-b105-4266-97d6-9b886f9052f9" containerID="ea5ed2b8e5b7d793693f4dfd6bb34dd0664474e1c7eed629dc6cc3691dc3748a" exitCode=0
Oct 01 13:45:01 crc kubenswrapper[4774]: I1001 13:45:01.244143 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j" event={"ID":"63f53406-b105-4266-97d6-9b886f9052f9","Type":"ContainerDied","Data":"ea5ed2b8e5b7d793693f4dfd6bb34dd0664474e1c7eed629dc6cc3691dc3748a"}
Oct 01 13:45:01 crc kubenswrapper[4774]: I1001 13:45:01.244174 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j" event={"ID":"63f53406-b105-4266-97d6-9b886f9052f9","Type":"ContainerStarted","Data":"5a7438191b4b47b3930555f34237abc2ab58f4e891766f20b27b6e1ebaad37ff"}
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.530405 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.696255 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q8kn\" (UniqueName: \"kubernetes.io/projected/63f53406-b105-4266-97d6-9b886f9052f9-kube-api-access-9q8kn\") pod \"63f53406-b105-4266-97d6-9b886f9052f9\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") "
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.696303 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63f53406-b105-4266-97d6-9b886f9052f9-secret-volume\") pod \"63f53406-b105-4266-97d6-9b886f9052f9\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") "
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.696409 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63f53406-b105-4266-97d6-9b886f9052f9-config-volume\") pod \"63f53406-b105-4266-97d6-9b886f9052f9\" (UID: \"63f53406-b105-4266-97d6-9b886f9052f9\") "
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.698015 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63f53406-b105-4266-97d6-9b886f9052f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "63f53406-b105-4266-97d6-9b886f9052f9" (UID: "63f53406-b105-4266-97d6-9b886f9052f9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.701715 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f53406-b105-4266-97d6-9b886f9052f9-kube-api-access-9q8kn" (OuterVolumeSpecName: "kube-api-access-9q8kn") pod "63f53406-b105-4266-97d6-9b886f9052f9" (UID: "63f53406-b105-4266-97d6-9b886f9052f9"). InnerVolumeSpecName "kube-api-access-9q8kn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.701875 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f53406-b105-4266-97d6-9b886f9052f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "63f53406-b105-4266-97d6-9b886f9052f9" (UID: "63f53406-b105-4266-97d6-9b886f9052f9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.797922 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q8kn\" (UniqueName: \"kubernetes.io/projected/63f53406-b105-4266-97d6-9b886f9052f9-kube-api-access-9q8kn\") on node \"crc\" DevicePath \"\""
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.798231 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/63f53406-b105-4266-97d6-9b886f9052f9-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 01 13:45:02 crc kubenswrapper[4774]: I1001 13:45:02.798246 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63f53406-b105-4266-97d6-9b886f9052f9-config-volume\") on node \"crc\" DevicePath \"\""
Oct 01 13:45:03 crc kubenswrapper[4774]: I1001 13:45:03.257575 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j" event={"ID":"63f53406-b105-4266-97d6-9b886f9052f9","Type":"ContainerDied","Data":"5a7438191b4b47b3930555f34237abc2ab58f4e891766f20b27b6e1ebaad37ff"}
Oct 01 13:45:03 crc kubenswrapper[4774]: I1001 13:45:03.257833 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a7438191b4b47b3930555f34237abc2ab58f4e891766f20b27b6e1ebaad37ff"
Oct 01 13:45:03 crc kubenswrapper[4774]: I1001 13:45:03.257691 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mzd2j"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.734256 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7v4t6"]
Oct 01 13:45:11 crc kubenswrapper[4774]: E1001 13:45:11.735028 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f53406-b105-4266-97d6-9b886f9052f9" containerName="collect-profiles"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.735050 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f53406-b105-4266-97d6-9b886f9052f9" containerName="collect-profiles"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.735250 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f53406-b105-4266-97d6-9b886f9052f9" containerName="collect-profiles"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.735806 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.759192 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7v4t6"]
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836291 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836338 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d2e5068-1118-446d-b53b-c65cba67675b-registry-certificates\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836361 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d2e5068-1118-446d-b53b-c65cba67675b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836385 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-bound-sa-token\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836400 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6nzx\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-kube-api-access-w6nzx\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836523 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d2e5068-1118-446d-b53b-c65cba67675b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836558 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-registry-tls\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.836578 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d2e5068-1118-446d-b53b-c65cba67675b-trusted-ca\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.859970 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.937699 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d2e5068-1118-446d-b53b-c65cba67675b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.937957 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-registry-tls\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.937980 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d2e5068-1118-446d-b53b-c65cba67675b-trusted-ca\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.938017 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d2e5068-1118-446d-b53b-c65cba67675b-registry-certificates\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.938046 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d2e5068-1118-446d-b53b-c65cba67675b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.938075 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-bound-sa-token\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.938097 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6nzx\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-kube-api-access-w6nzx\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.938548 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6d2e5068-1118-446d-b53b-c65cba67675b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.939222 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d2e5068-1118-446d-b53b-c65cba67675b-trusted-ca\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.939338 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6d2e5068-1118-446d-b53b-c65cba67675b-registry-certificates\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.943393 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6d2e5068-1118-446d-b53b-c65cba67675b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.947075 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-registry-tls\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.953577 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-bound-sa-token\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:11 crc kubenswrapper[4774]: I1001 13:45:11.954686 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6nzx\" (UniqueName: \"kubernetes.io/projected/6d2e5068-1118-446d-b53b-c65cba67675b-kube-api-access-w6nzx\") pod \"image-registry-66df7c8f76-7v4t6\" (UID: \"6d2e5068-1118-446d-b53b-c65cba67675b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:12 crc kubenswrapper[4774]: I1001 13:45:12.056112 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:12 crc kubenswrapper[4774]: I1001 13:45:12.576095 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7v4t6"]
Oct 01 13:45:13 crc kubenswrapper[4774]: I1001 13:45:13.336109 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6" event={"ID":"6d2e5068-1118-446d-b53b-c65cba67675b","Type":"ContainerStarted","Data":"78522125204a734428f9f8edc3a5cc80706f61082d2f7b62e6b60a903d3cb5a9"}
Oct 01 13:45:13 crc kubenswrapper[4774]: I1001 13:45:13.336521 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:13 crc kubenswrapper[4774]: I1001 13:45:13.336538 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6" event={"ID":"6d2e5068-1118-446d-b53b-c65cba67675b","Type":"ContainerStarted","Data":"c13c032e43ccec1f72efd7ecfac9cdb660c83f7aba514e32c73309977ec5926a"}
Oct 01 13:45:13 crc kubenswrapper[4774]: I1001 13:45:13.358780 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6" podStartSLOduration=2.358753145 podStartE2EDuration="2.358753145s" podCreationTimestamp="2025-10-01 13:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:45:13.357996904 +0000 UTC m=+485.247627581" watchObservedRunningTime="2025-10-01 13:45:13.358753145 +0000 UTC m=+485.248383782"
Oct 01 13:45:32 crc kubenswrapper[4774]: I1001 13:45:32.064905 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7v4t6"
Oct 01 13:45:32 crc kubenswrapper[4774]: I1001 13:45:32.172993 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xcl4x"]
Oct 01 13:45:37 crc kubenswrapper[4774]: I1001 13:45:37.271272 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:45:37 crc kubenswrapper[4774]: I1001 13:45:37.272650 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.218167 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" podUID="82f9c75f-2ec4-4089-88b1-0bb1ba287f16" containerName="registry" containerID="cri-o://05c0e75da268172c12d27baef0368a761d6cdc7660bb265c79e1486c3bdd457e" gracePeriod=30
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.635417 4774 generic.go:334] "Generic (PLEG): container finished" podID="82f9c75f-2ec4-4089-88b1-0bb1ba287f16" containerID="05c0e75da268172c12d27baef0368a761d6cdc7660bb265c79e1486c3bdd457e" exitCode=0
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.635601 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" event={"ID":"82f9c75f-2ec4-4089-88b1-0bb1ba287f16","Type":"ContainerDied","Data":"05c0e75da268172c12d27baef0368a761d6cdc7660bb265c79e1486c3bdd457e"}
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.635897 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" event={"ID":"82f9c75f-2ec4-4089-88b1-0bb1ba287f16","Type":"ContainerDied","Data":"fe0783984cc7dcc481abf7afde0858cc14ea1188e82b072bc7244382a4cea8db"}
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.635964 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0783984cc7dcc481abf7afde0858cc14ea1188e82b072bc7244382a4cea8db"
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.642295 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x"
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.669889 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-trusted-ca\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670001 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92ggd\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-kube-api-access-92ggd\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670105 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-tls\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670189 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-bound-sa-token\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670273 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-certificates\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670349 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-installation-pull-secrets\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670389 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-ca-trust-extracted\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.670792 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\" (UID: \"82f9c75f-2ec4-4089-88b1-0bb1ba287f16\") "
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.671803 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.682384 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.683690 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-kube-api-access-92ggd" (OuterVolumeSpecName: "kube-api-access-92ggd") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "kube-api-access-92ggd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.684350 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.687658 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "installation-pull-secrets".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.691947 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.701070 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.706750 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "82f9c75f-2ec4-4089-88b1-0bb1ba287f16" (UID: "82f9c75f-2ec4-4089-88b1-0bb1ba287f16"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773504 4774 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773576 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92ggd\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-kube-api-access-92ggd\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773595 4774 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773610 4774 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773658 4774 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773675 4774 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:57 crc kubenswrapper[4774]: I1001 13:45:57.773690 4774 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/82f9c75f-2ec4-4089-88b1-0bb1ba287f16-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:58 crc 
kubenswrapper[4774]: I1001 13:45:58.642497 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xcl4x" Oct 01 13:45:58 crc kubenswrapper[4774]: I1001 13:45:58.679942 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xcl4x"] Oct 01 13:45:58 crc kubenswrapper[4774]: I1001 13:45:58.682108 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xcl4x"] Oct 01 13:45:58 crc kubenswrapper[4774]: I1001 13:45:58.882995 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f9c75f-2ec4-4089-88b1-0bb1ba287f16" path="/var/lib/kubelet/pods/82f9c75f-2ec4-4089-88b1-0bb1ba287f16/volumes" Oct 01 13:46:07 crc kubenswrapper[4774]: I1001 13:46:07.271155 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:46:07 crc kubenswrapper[4774]: I1001 13:46:07.271729 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:46:09 crc kubenswrapper[4774]: I1001 13:46:09.046100 4774 scope.go:117] "RemoveContainer" containerID="05c0e75da268172c12d27baef0368a761d6cdc7660bb265c79e1486c3bdd457e" Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.271592 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.272429 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.272568 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.273619 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f720152dbaf5a7960f0dcab2fc9653460d3dce4b218e09d42a4179e76eb1660"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.273707 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://4f720152dbaf5a7960f0dcab2fc9653460d3dce4b218e09d42a4179e76eb1660" gracePeriod=600 Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.928060 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="4f720152dbaf5a7960f0dcab2fc9653460d3dce4b218e09d42a4179e76eb1660" exitCode=0 Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.928137 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" 
event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"4f720152dbaf5a7960f0dcab2fc9653460d3dce4b218e09d42a4179e76eb1660"} Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.928969 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"f5fe54fbf797d4b0be5b34d0ef2c73873c911635f7884f87ede82bc1e52a3917"} Oct 01 13:46:37 crc kubenswrapper[4774]: I1001 13:46:37.929018 4774 scope.go:117] "RemoveContainer" containerID="45fe67a4b4a9a83e2ae4bc8f080985498cc6ec39fabf36b93a55613990ba18db" Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.897651 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v7jfr"] Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.899557 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-controller" containerID="cri-o://46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.899806 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="northd" containerID="cri-o://44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.899941 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="sbdb" containerID="cri-o://7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.900043 4774 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-acl-logging" containerID="cri-o://0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.900102 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.900154 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-node" containerID="cri-o://889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.899592 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="nbdb" containerID="cri-o://94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" gracePeriod=30 Oct 01 13:48:32 crc kubenswrapper[4774]: I1001 13:48:32.982898 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" containerID="cri-o://06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" gracePeriod=30 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.680660 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/3.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.684374 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovn-acl-logging/0.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.685589 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovn-controller/0.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.686244 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.712107 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/2.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.712942 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/1.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.713004 4774 generic.go:334] "Generic (PLEG): container finished" podID="be8a0f8f-0098-4fa6-b4b2-ceda580f19b5" containerID="e564739acfde2ac5595724369cd4bac33083d339207bb468d7886b72ecf7cb09" exitCode=2 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.713084 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerDied","Data":"e564739acfde2ac5595724369cd4bac33083d339207bb468d7886b72ecf7cb09"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.713119 4774 scope.go:117] "RemoveContainer" containerID="8fbd8e52fc7f9ffac8bf8926f8e2ca24cfa639776803f018baf6fd267ade952e" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 
13:48:33.713663 4774 scope.go:117] "RemoveContainer" containerID="e564739acfde2ac5595724369cd4bac33083d339207bb468d7886b72ecf7cb09" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.713872 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8svls_openshift-multus(be8a0f8f-0098-4fa6-b4b2-ceda580f19b5)\"" pod="openshift-multus/multus-8svls" podUID="be8a0f8f-0098-4fa6-b4b2-ceda580f19b5" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.719181 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovnkube-controller/3.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.722603 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovn-acl-logging/0.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723248 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-v7jfr_e3ee3cb3-6187-468f-9b58-60a18ef2da67/ovn-controller/0.log" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723687 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" exitCode=0 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723725 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" exitCode=0 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723739 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" 
exitCode=0 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723752 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" exitCode=0 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723765 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" exitCode=0 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723777 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" exitCode=0 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723789 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" exitCode=143 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723802 4774 generic.go:334] "Generic (PLEG): container finished" podID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" exitCode=143 Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723869 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723917 4774 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723942 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723964 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723981 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.723999 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724016 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724028 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 
13:48:33.724042 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724054 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724065 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724075 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724087 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724098 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724108 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724127 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" 
event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724142 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724154 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724165 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724175 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724185 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724195 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724206 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724217 4774 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724228 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724238 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724253 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724267 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724280 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724291 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724301 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} Oct 01 13:48:33 crc kubenswrapper[4774]: 
I1001 13:48:33.724311 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724322 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724333 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724343 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724354 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724365 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724381 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" event={"ID":"e3ee3cb3-6187-468f-9b58-60a18ef2da67","Type":"ContainerDied","Data":"a3b86086d7245e27b984358968c2142debea2cd6b6c0209b0196de96dca863d0"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724402 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724417 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724430 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724444 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724491 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724507 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724520 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724534 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724547 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724560 4774 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.724714 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-v7jfr" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.762730 4774 scope.go:117] "RemoveContainer" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.765835 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hv4k6"] Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766010 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="northd" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766027 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="northd" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766072 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="sbdb" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766078 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="sbdb" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766085 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766091 4774 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766099 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kubecfg-setup" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766104 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kubecfg-setup" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766115 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766122 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766131 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-acl-logging" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766137 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-acl-logging" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766146 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766152 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766160 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766166 4774 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766173 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-node" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766178 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-node" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766186 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766192 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766200 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f9c75f-2ec4-4089-88b1-0bb1ba287f16" containerName="registry" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766206 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f9c75f-2ec4-4089-88b1-0bb1ba287f16" containerName="registry" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766216 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="nbdb" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766222 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="nbdb" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766300 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766308 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766315 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="sbdb" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766323 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766335 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovn-acl-logging" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766345 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766352 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f9c75f-2ec4-4089-88b1-0bb1ba287f16" containerName="registry" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766360 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="northd" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766370 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="nbdb" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766377 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766387 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="kube-rbac-proxy-node" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766520 4774 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766530 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.766540 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766546 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766616 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.766627 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" containerName="ovnkube-controller" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.768940 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.787414 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.818077 4774 scope.go:117] "RemoveContainer" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.832079 4774 scope.go:117] "RemoveContainer" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835672 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-var-lib-openvswitch\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835719 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovn-node-metrics-cert\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835742 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-netns\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835767 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-ovn-kubernetes\") pod 
\"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835782 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-netd\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835841 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-etc-openvswitch\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835862 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-ovn\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-openvswitch\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835896 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-systemd\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835914 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-8t7th\" (UniqueName: \"kubernetes.io/projected/e3ee3cb3-6187-468f-9b58-60a18ef2da67-kube-api-access-8t7th\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835933 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-slash\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835945 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-node-log\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835963 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-log-socket\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835979 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-bin\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.835994 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-kubelet\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: 
I1001 13:48:33.836009 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-env-overrides\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836030 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-script-lib\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836051 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-config\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836066 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836080 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-systemd-units\") pod \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\" (UID: \"e3ee3cb3-6187-468f-9b58-60a18ef2da67\") " Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836358 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-slash" 
(OuterVolumeSpecName: "host-slash") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836439 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836523 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836495 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-node-log" (OuterVolumeSpecName: "node-log") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836511 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-log-socket" (OuterVolumeSpecName: "log-socket") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836562 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836554 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836593 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836606 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836688 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836738 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836787 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.836866 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.837026 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.837271 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.837499 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.837524 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.841670 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ee3cb3-6187-468f-9b58-60a18ef2da67-kube-api-access-8t7th" (OuterVolumeSpecName: "kube-api-access-8t7th") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "kube-api-access-8t7th". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.841973 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.848694 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e3ee3cb3-6187-468f-9b58-60a18ef2da67" (UID: "e3ee3cb3-6187-468f-9b58-60a18ef2da67"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.860818 4774 scope.go:117] "RemoveContainer" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.875439 4774 scope.go:117] "RemoveContainer" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.889771 4774 scope.go:117] "RemoveContainer" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.908726 4774 scope.go:117] "RemoveContainer" containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.922488 4774 scope.go:117] "RemoveContainer" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.936706 4774 scope.go:117] "RemoveContainer" containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937131 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-log-socket\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937181 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-node-log\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937209 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-env-overrides\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937231 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-cni-netd\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937251 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-run-netns\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937275 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-slash\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937299 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-run-ovn-kubernetes\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 
13:48:33.937320 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-var-lib-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937342 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovnkube-config\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937361 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovnkube-script-lib\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937383 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-kubelet\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937402 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 
crc kubenswrapper[4774]: I1001 13:48:33.937421 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovn-node-metrics-cert\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937441 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-systemd\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937504 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-cni-bin\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937601 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937626 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-systemd-units\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937662 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-ovn\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937705 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm7bs\" (UniqueName: \"kubernetes.io/projected/29d15a14-6a96-472c-bd6b-f1be65afff3f-kube-api-access-qm7bs\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-etc-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937815 4774 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937837 4774 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937853 4774 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937868 4774 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.937884 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t7th\" (UniqueName: \"kubernetes.io/projected/e3ee3cb3-6187-468f-9b58-60a18ef2da67-kube-api-access-8t7th\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938292 4774 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938312 4774 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938321 4774 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938330 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938340 4774 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc 
kubenswrapper[4774]: I1001 13:48:33.938348 4774 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938356 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938366 4774 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938374 4774 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938382 4774 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938391 4774 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938399 4774 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3ee3cb3-6187-468f-9b58-60a18ef2da67-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938409 
4774 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938418 4774 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.938426 4774 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3ee3cb3-6187-468f-9b58-60a18ef2da67-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.951420 4774 scope.go:117] "RemoveContainer" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.951972 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": container with ID starting with 06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8 not found: ID does not exist" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.952004 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} err="failed to get container status \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": rpc error: code = NotFound desc = could not find container \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": container with ID starting with 06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8 not found: ID does not exist" Oct 
01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.952032 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.952805 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": container with ID starting with b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223 not found: ID does not exist" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.952930 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} err="failed to get container status \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": rpc error: code = NotFound desc = could not find container \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": container with ID starting with b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.952972 4774 scope.go:117] "RemoveContainer" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.953404 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": container with ID starting with 7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574 not found: ID does not exist" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.953443 4774 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} err="failed to get container status \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": rpc error: code = NotFound desc = could not find container \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": container with ID starting with 7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.953490 4774 scope.go:117] "RemoveContainer" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.953852 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": container with ID starting with 94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b not found: ID does not exist" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.953897 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} err="failed to get container status \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": rpc error: code = NotFound desc = could not find container \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": container with ID starting with 94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.953919 4774 scope.go:117] "RemoveContainer" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.954363 4774 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": container with ID starting with 44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52 not found: ID does not exist" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.954389 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} err="failed to get container status \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": rpc error: code = NotFound desc = could not find container \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": container with ID starting with 44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.954410 4774 scope.go:117] "RemoveContainer" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.954893 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": container with ID starting with 4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae not found: ID does not exist" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.954936 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} err="failed to get container status \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": rpc error: code = NotFound desc = could not find container 
\"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": container with ID starting with 4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.954959 4774 scope.go:117] "RemoveContainer" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.955661 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": container with ID starting with 889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be not found: ID does not exist" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.955703 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} err="failed to get container status \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": rpc error: code = NotFound desc = could not find container \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": container with ID starting with 889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.955725 4774 scope.go:117] "RemoveContainer" containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.956034 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": container with ID starting with 0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7 not found: ID does not exist" 
containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.956062 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} err="failed to get container status \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": rpc error: code = NotFound desc = could not find container \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": container with ID starting with 0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.956079 4774 scope.go:117] "RemoveContainer" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.956372 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": container with ID starting with 46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79 not found: ID does not exist" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.956420 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} err="failed to get container status \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": rpc error: code = NotFound desc = could not find container \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": container with ID starting with 46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.956440 4774 scope.go:117] 
"RemoveContainer" containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" Oct 01 13:48:33 crc kubenswrapper[4774]: E1001 13:48:33.956708 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": container with ID starting with 425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d not found: ID does not exist" containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.956728 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} err="failed to get container status \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": rpc error: code = NotFound desc = could not find container \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": container with ID starting with 425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.956740 4774 scope.go:117] "RemoveContainer" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.957111 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} err="failed to get container status \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": rpc error: code = NotFound desc = could not find container \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": container with ID starting with 06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.957167 4774 
scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.957443 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} err="failed to get container status \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": rpc error: code = NotFound desc = could not find container \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": container with ID starting with b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.957483 4774 scope.go:117] "RemoveContainer" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.957732 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} err="failed to get container status \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": rpc error: code = NotFound desc = could not find container \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": container with ID starting with 7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.957782 4774 scope.go:117] "RemoveContainer" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.958124 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} err="failed to get container status \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": rpc 
error: code = NotFound desc = could not find container \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": container with ID starting with 94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.958142 4774 scope.go:117] "RemoveContainer" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.958393 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} err="failed to get container status \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": rpc error: code = NotFound desc = could not find container \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": container with ID starting with 44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.958408 4774 scope.go:117] "RemoveContainer" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.958773 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} err="failed to get container status \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": rpc error: code = NotFound desc = could not find container \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": container with ID starting with 4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.958804 4774 scope.go:117] "RemoveContainer" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" Oct 01 13:48:33 crc 
kubenswrapper[4774]: I1001 13:48:33.959081 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} err="failed to get container status \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": rpc error: code = NotFound desc = could not find container \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": container with ID starting with 889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959100 4774 scope.go:117] "RemoveContainer" containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959374 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} err="failed to get container status \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": rpc error: code = NotFound desc = could not find container \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": container with ID starting with 0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959393 4774 scope.go:117] "RemoveContainer" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959684 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} err="failed to get container status \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": rpc error: code = NotFound desc = could not find container \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": container 
with ID starting with 46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959711 4774 scope.go:117] "RemoveContainer" containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959961 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} err="failed to get container status \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": rpc error: code = NotFound desc = could not find container \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": container with ID starting with 425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.959997 4774 scope.go:117] "RemoveContainer" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960253 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} err="failed to get container status \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": rpc error: code = NotFound desc = could not find container \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": container with ID starting with 06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960272 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960514 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} err="failed to get container status \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": rpc error: code = NotFound desc = could not find container \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": container with ID starting with b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960533 4774 scope.go:117] "RemoveContainer" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960729 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} err="failed to get container status \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": rpc error: code = NotFound desc = could not find container \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": container with ID starting with 7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960753 4774 scope.go:117] "RemoveContainer" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960938 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} err="failed to get container status \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": rpc error: code = NotFound desc = could not find container \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": container with ID starting with 94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b not found: ID does not 
exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.960967 4774 scope.go:117] "RemoveContainer" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.961179 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} err="failed to get container status \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": rpc error: code = NotFound desc = could not find container \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": container with ID starting with 44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.961197 4774 scope.go:117] "RemoveContainer" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.961651 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} err="failed to get container status \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": rpc error: code = NotFound desc = could not find container \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": container with ID starting with 4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.961669 4774 scope.go:117] "RemoveContainer" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.961969 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} err="failed to get container status 
\"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": rpc error: code = NotFound desc = could not find container \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": container with ID starting with 889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.962004 4774 scope.go:117] "RemoveContainer" containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.962254 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} err="failed to get container status \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": rpc error: code = NotFound desc = could not find container \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": container with ID starting with 0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.962289 4774 scope.go:117] "RemoveContainer" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.962630 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} err="failed to get container status \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": rpc error: code = NotFound desc = could not find container \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": container with ID starting with 46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.962653 4774 scope.go:117] "RemoveContainer" 
containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.962998 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} err="failed to get container status \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": rpc error: code = NotFound desc = could not find container \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": container with ID starting with 425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.963017 4774 scope.go:117] "RemoveContainer" containerID="06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.963308 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8"} err="failed to get container status \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": rpc error: code = NotFound desc = could not find container \"06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8\": container with ID starting with 06fdfb38c63e6bb30b7096622a1ab1fcffeddf109b27ff145706ac8513e81fe8 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.963328 4774 scope.go:117] "RemoveContainer" containerID="b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.963753 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223"} err="failed to get container status \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": rpc error: code = NotFound desc = could 
not find container \"b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223\": container with ID starting with b3005433624a5e70bcb539667720b014b62abcf5f5643c0881c171dad5a53223 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.963781 4774 scope.go:117] "RemoveContainer" containerID="7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964049 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574"} err="failed to get container status \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": rpc error: code = NotFound desc = could not find container \"7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574\": container with ID starting with 7574ae776795e189b5b305d181d42fb5f08a9aeba52b8ce70b80b06e61884574 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964071 4774 scope.go:117] "RemoveContainer" containerID="94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964324 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b"} err="failed to get container status \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": rpc error: code = NotFound desc = could not find container \"94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b\": container with ID starting with 94cba57971693ce1e8433f1de88a82f4d430c094d76034c6739a623f741a7a0b not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964345 4774 scope.go:117] "RemoveContainer" containerID="44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 
13:48:33.964655 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52"} err="failed to get container status \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": rpc error: code = NotFound desc = could not find container \"44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52\": container with ID starting with 44ffac8ee8c3339133ff728a848bc6e027c65db05604921351c2acaa57ed6e52 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964674 4774 scope.go:117] "RemoveContainer" containerID="4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964920 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae"} err="failed to get container status \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": rpc error: code = NotFound desc = could not find container \"4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae\": container with ID starting with 4061e5ad510a89bea35ae6bece485c38a7f6e2e1e01a6b47678d8257a5fdd0ae not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.964937 4774 scope.go:117] "RemoveContainer" containerID="889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965187 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be"} err="failed to get container status \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": rpc error: code = NotFound desc = could not find container \"889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be\": container with ID starting with 
889aa3717a09c902fc4c5458ac234e0a8e5fde34c322c6bb02c4fa29467a77be not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965204 4774 scope.go:117] "RemoveContainer" containerID="0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965474 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7"} err="failed to get container status \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": rpc error: code = NotFound desc = could not find container \"0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7\": container with ID starting with 0174ab414e858777e7b7efed67e36052cad16fad7924eab5c091482f9f5dc6a7 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965497 4774 scope.go:117] "RemoveContainer" containerID="46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965702 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79"} err="failed to get container status \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": rpc error: code = NotFound desc = could not find container \"46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79\": container with ID starting with 46c7339b3b5bc4386c782341dd2fc1137ea7d28e80daad83e5cff9451ebbbf79 not found: ID does not exist" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965717 4774 scope.go:117] "RemoveContainer" containerID="425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d" Oct 01 13:48:33 crc kubenswrapper[4774]: I1001 13:48:33.965911 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d"} err="failed to get container status \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": rpc error: code = NotFound desc = could not find container \"425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d\": container with ID starting with 425ff1f558123081ac39d35d2f3a5c95d993ace6927c3aa53a49400409e5423d not found: ID does not exist" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039667 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-node-log\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039701 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-env-overrides\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039718 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-run-netns\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039733 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-cni-netd\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 
crc kubenswrapper[4774]: I1001 13:48:34.039761 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-slash\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039779 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-run-ovn-kubernetes\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039783 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-run-netns\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039813 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-var-lib-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039793 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-var-lib-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039793 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-cni-netd\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039757 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-node-log\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039820 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-slash\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039854 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovnkube-config\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039942 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-run-ovn-kubernetes\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.039963 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovnkube-script-lib\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040065 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-kubelet\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040082 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040095 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovn-node-metrics-cert\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040108 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-systemd\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040144 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-cni-bin\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040160 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040176 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-systemd-units\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040193 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-ovn\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040215 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm7bs\" (UniqueName: \"kubernetes.io/projected/29d15a14-6a96-472c-bd6b-f1be65afff3f-kube-api-access-qm7bs\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040229 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-etc-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040260 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-log-socket\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040305 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-log-socket\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040523 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-cni-bin\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040552 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-kubelet\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040573 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: 
\"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040623 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-env-overrides\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040711 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040757 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-systemd\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040793 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovnkube-script-lib\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040824 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-run-ovn\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040843 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-etc-openvswitch\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040857 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/29d15a14-6a96-472c-bd6b-f1be65afff3f-systemd-units\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.040883 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovnkube-config\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.044824 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29d15a14-6a96-472c-bd6b-f1be65afff3f-ovn-node-metrics-cert\") pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.058401 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v7jfr"] Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.061297 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm7bs\" (UniqueName: \"kubernetes.io/projected/29d15a14-6a96-472c-bd6b-f1be65afff3f-kube-api-access-qm7bs\") 
pod \"ovnkube-node-hv4k6\" (UID: \"29d15a14-6a96-472c-bd6b-f1be65afff3f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.063079 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-v7jfr"] Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.095534 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.735643 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/2.log" Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.739274 4774 generic.go:334] "Generic (PLEG): container finished" podID="29d15a14-6a96-472c-bd6b-f1be65afff3f" containerID="5a816c697dba49d9b8429206e9c21bab8a86edf49a83d24827a964c867011f49" exitCode=0 Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.739354 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerDied","Data":"5a816c697dba49d9b8429206e9c21bab8a86edf49a83d24827a964c867011f49"} Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.739509 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"2c55f1be61de9c8d514ce8b3b30d62a4ffc752ec54b53d563a5b39ae0eb9a51d"} Oct 01 13:48:34 crc kubenswrapper[4774]: I1001 13:48:34.885957 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ee3cb3-6187-468f-9b58-60a18ef2da67" path="/var/lib/kubelet/pods/e3ee3cb3-6187-468f-9b58-60a18ef2da67/volumes" Oct 01 13:48:35 crc kubenswrapper[4774]: I1001 13:48:35.754184 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"d98e1cec488207a9e30258538aa434ce25619e33b56b2adc5f1a407df1b39dcc"} Oct 01 13:48:35 crc kubenswrapper[4774]: I1001 13:48:35.754615 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"a60124473156d5dceec62d25267006e834f46a5d84656aba75c9a1d319f84d8b"} Oct 01 13:48:35 crc kubenswrapper[4774]: I1001 13:48:35.754637 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"f32b782644f078390aafae6ce681d7235f08c594c6467cd110bf2010d59fd55e"} Oct 01 13:48:35 crc kubenswrapper[4774]: I1001 13:48:35.754654 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"9386c25888ab158341e2d369844028ecc8908b647f7466435999bc36158fd2cd"} Oct 01 13:48:35 crc kubenswrapper[4774]: I1001 13:48:35.754673 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"bcdd4c0cf4663bfd923008b82aa1626aee77b74c7f74d8d381bce6aced89eed9"} Oct 01 13:48:36 crc kubenswrapper[4774]: I1001 13:48:36.778788 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"0b902b2aa4c824f4edb77140bd18fefe0a33022e4d2e5eaa76d2fa10cfb7e34e"} Oct 01 13:48:37 crc kubenswrapper[4774]: I1001 13:48:37.270590 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:48:37 crc kubenswrapper[4774]: I1001 13:48:37.270686 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:48:38 crc kubenswrapper[4774]: I1001 13:48:38.801197 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"b6e09dbbdef28f1a0859c2ba3765fd2ae78c11368e942907d695329c0194ecbb"} Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.819222 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" event={"ID":"29d15a14-6a96-472c-bd6b-f1be65afff3f","Type":"ContainerStarted","Data":"c7b3c6c2224fb2692241f1ef100d18256239f7e5930ffb7bf1357c667de3da71"} Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.820605 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.820654 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.820669 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.850529 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" podStartSLOduration=7.850507344 
podStartE2EDuration="7.850507344s" podCreationTimestamp="2025-10-01 13:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:48:40.847761792 +0000 UTC m=+692.737392429" watchObservedRunningTime="2025-10-01 13:48:40.850507344 +0000 UTC m=+692.740137991" Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.854744 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:40 crc kubenswrapper[4774]: I1001 13:48:40.855957 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:48:47 crc kubenswrapper[4774]: I1001 13:48:47.871895 4774 scope.go:117] "RemoveContainer" containerID="e564739acfde2ac5595724369cd4bac33083d339207bb468d7886b72ecf7cb09" Oct 01 13:48:47 crc kubenswrapper[4774]: E1001 13:48:47.872828 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8svls_openshift-multus(be8a0f8f-0098-4fa6-b4b2-ceda580f19b5)\"" pod="openshift-multus/multus-8svls" podUID="be8a0f8f-0098-4fa6-b4b2-ceda580f19b5" Oct 01 13:48:58 crc kubenswrapper[4774]: I1001 13:48:58.875101 4774 scope.go:117] "RemoveContainer" containerID="e564739acfde2ac5595724369cd4bac33083d339207bb468d7886b72ecf7cb09" Oct 01 13:48:59 crc kubenswrapper[4774]: I1001 13:48:59.953403 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8svls_be8a0f8f-0098-4fa6-b4b2-ceda580f19b5/kube-multus/2.log" Oct 01 13:48:59 crc kubenswrapper[4774]: I1001 13:48:59.953884 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svls" 
event={"ID":"be8a0f8f-0098-4fa6-b4b2-ceda580f19b5","Type":"ContainerStarted","Data":"52f43b53aff30a12496786d7d4deb6d1294c3b461153db75daba17ddb4762e9d"} Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.274570 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-lswck"] Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.275544 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.279743 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-ptc4f" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.279894 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.280478 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.305877 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-lswck"] Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.366538 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6wtz\" (UniqueName: \"kubernetes.io/projected/be5235e6-155d-4e2d-b259-ac9cdd95ace0-kube-api-access-p6wtz\") pod \"mariadb-operator-index-lswck\" (UID: \"be5235e6-155d-4e2d-b259-ac9cdd95ace0\") " pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.468515 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6wtz\" (UniqueName: \"kubernetes.io/projected/be5235e6-155d-4e2d-b259-ac9cdd95ace0-kube-api-access-p6wtz\") pod \"mariadb-operator-index-lswck\" (UID: 
\"be5235e6-155d-4e2d-b259-ac9cdd95ace0\") " pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.486555 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6wtz\" (UniqueName: \"kubernetes.io/projected/be5235e6-155d-4e2d-b259-ac9cdd95ace0-kube-api-access-p6wtz\") pod \"mariadb-operator-index-lswck\" (UID: \"be5235e6-155d-4e2d-b259-ac9cdd95ace0\") " pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.597017 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.840695 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-lswck"] Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.853245 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:49:02 crc kubenswrapper[4774]: I1001 13:49:02.971667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lswck" event={"ID":"be5235e6-155d-4e2d-b259-ac9cdd95ace0","Type":"ContainerStarted","Data":"ba12c3ba53e6f8f242c052a670afe8ba0192ac0f113ae637007540dc6d2ac993"} Oct 01 13:49:03 crc kubenswrapper[4774]: I1001 13:49:03.980661 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lswck" event={"ID":"be5235e6-155d-4e2d-b259-ac9cdd95ace0","Type":"ContainerStarted","Data":"b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1"} Oct 01 13:49:04 crc kubenswrapper[4774]: I1001 13:49:04.007934 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-lswck" podStartSLOduration=1.207996282 podStartE2EDuration="2.007910806s" podCreationTimestamp="2025-10-01 13:49:02 +0000 UTC" 
firstStartedPulling="2025-10-01 13:49:02.852957944 +0000 UTC m=+714.742588541" lastFinishedPulling="2025-10-01 13:49:03.652872468 +0000 UTC m=+715.542503065" observedRunningTime="2025-10-01 13:49:04.000580347 +0000 UTC m=+715.890211014" watchObservedRunningTime="2025-10-01 13:49:04.007910806 +0000 UTC m=+715.897541443" Oct 01 13:49:04 crc kubenswrapper[4774]: I1001 13:49:04.135371 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hv4k6" Oct 01 13:49:05 crc kubenswrapper[4774]: I1001 13:49:05.232776 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-lswck"] Oct 01 13:49:05 crc kubenswrapper[4774]: I1001 13:49:05.839008 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-2wbpw"] Oct 01 13:49:05 crc kubenswrapper[4774]: I1001 13:49:05.840058 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:05 crc kubenswrapper[4774]: I1001 13:49:05.855374 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-2wbpw"] Oct 01 13:49:05 crc kubenswrapper[4774]: I1001 13:49:05.994990 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-lswck" podUID="be5235e6-155d-4e2d-b259-ac9cdd95ace0" containerName="registry-server" containerID="cri-o://b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1" gracePeriod=2 Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.012628 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jzv5\" (UniqueName: \"kubernetes.io/projected/e2809b6a-b3bf-475f-8d9c-1f8609109e17-kube-api-access-7jzv5\") pod \"mariadb-operator-index-2wbpw\" (UID: \"e2809b6a-b3bf-475f-8d9c-1f8609109e17\") " 
pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.113678 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jzv5\" (UniqueName: \"kubernetes.io/projected/e2809b6a-b3bf-475f-8d9c-1f8609109e17-kube-api-access-7jzv5\") pod \"mariadb-operator-index-2wbpw\" (UID: \"e2809b6a-b3bf-475f-8d9c-1f8609109e17\") " pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.150655 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jzv5\" (UniqueName: \"kubernetes.io/projected/e2809b6a-b3bf-475f-8d9c-1f8609109e17-kube-api-access-7jzv5\") pod \"mariadb-operator-index-2wbpw\" (UID: \"e2809b6a-b3bf-475f-8d9c-1f8609109e17\") " pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.168165 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.439862 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.440418 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-2wbpw"] Oct 01 13:49:06 crc kubenswrapper[4774]: W1001 13:49:06.460719 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2809b6a_b3bf_475f_8d9c_1f8609109e17.slice/crio-3ae97a01d1fd7d69082d20d5865ef0472f89b54172fce333b2d3caa931bc20c3 WatchSource:0}: Error finding container 3ae97a01d1fd7d69082d20d5865ef0472f89b54172fce333b2d3caa931bc20c3: Status 404 returned error can't find the container with id 3ae97a01d1fd7d69082d20d5865ef0472f89b54172fce333b2d3caa931bc20c3 Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.619751 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6wtz\" (UniqueName: \"kubernetes.io/projected/be5235e6-155d-4e2d-b259-ac9cdd95ace0-kube-api-access-p6wtz\") pod \"be5235e6-155d-4e2d-b259-ac9cdd95ace0\" (UID: \"be5235e6-155d-4e2d-b259-ac9cdd95ace0\") " Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.625027 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5235e6-155d-4e2d-b259-ac9cdd95ace0-kube-api-access-p6wtz" (OuterVolumeSpecName: "kube-api-access-p6wtz") pod "be5235e6-155d-4e2d-b259-ac9cdd95ace0" (UID: "be5235e6-155d-4e2d-b259-ac9cdd95ace0"). InnerVolumeSpecName "kube-api-access-p6wtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:49:06 crc kubenswrapper[4774]: I1001 13:49:06.722084 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6wtz\" (UniqueName: \"kubernetes.io/projected/be5235e6-155d-4e2d-b259-ac9cdd95ace0-kube-api-access-p6wtz\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.003988 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-2wbpw" event={"ID":"e2809b6a-b3bf-475f-8d9c-1f8609109e17","Type":"ContainerStarted","Data":"3ae97a01d1fd7d69082d20d5865ef0472f89b54172fce333b2d3caa931bc20c3"} Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.007066 4774 generic.go:334] "Generic (PLEG): container finished" podID="be5235e6-155d-4e2d-b259-ac9cdd95ace0" containerID="b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1" exitCode=0 Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.007128 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lswck" event={"ID":"be5235e6-155d-4e2d-b259-ac9cdd95ace0","Type":"ContainerDied","Data":"b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1"} Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.007162 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-lswck" Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.007178 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-lswck" event={"ID":"be5235e6-155d-4e2d-b259-ac9cdd95ace0","Type":"ContainerDied","Data":"ba12c3ba53e6f8f242c052a670afe8ba0192ac0f113ae637007540dc6d2ac993"} Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.007209 4774 scope.go:117] "RemoveContainer" containerID="b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1" Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.066773 4774 scope.go:117] "RemoveContainer" containerID="b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1" Oct 01 13:49:07 crc kubenswrapper[4774]: E1001 13:49:07.067100 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1\": container with ID starting with b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1 not found: ID does not exist" containerID="b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1" Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.067135 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1"} err="failed to get container status \"b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1\": rpc error: code = NotFound desc = could not find container \"b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1\": container with ID starting with b7bab93a9e5f407eb7443c0c6f06484d465167b6e8f25e5546e7165a8cbfd0c1 not found: ID does not exist" Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.067516 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-lswck"] Oct 01 
13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.086296 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-lswck"] Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.271480 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:49:07 crc kubenswrapper[4774]: I1001 13:49:07.271865 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:49:08 crc kubenswrapper[4774]: I1001 13:49:08.017321 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-2wbpw" event={"ID":"e2809b6a-b3bf-475f-8d9c-1f8609109e17","Type":"ContainerStarted","Data":"43dfbc354ec1dfe3c7e6707a45c6bc3a37d52dbae97445131475ab1651f77d61"} Oct 01 13:49:08 crc kubenswrapper[4774]: I1001 13:49:08.043067 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-2wbpw" podStartSLOduration=2.614588281 podStartE2EDuration="3.042947502s" podCreationTimestamp="2025-10-01 13:49:05 +0000 UTC" firstStartedPulling="2025-10-01 13:49:06.466548316 +0000 UTC m=+718.356178923" lastFinishedPulling="2025-10-01 13:49:06.894907517 +0000 UTC m=+718.784538144" observedRunningTime="2025-10-01 13:49:08.042548541 +0000 UTC m=+719.932179218" watchObservedRunningTime="2025-10-01 13:49:08.042947502 +0000 UTC m=+719.932578169" Oct 01 13:49:08 crc kubenswrapper[4774]: I1001 13:49:08.882919 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="be5235e6-155d-4e2d-b259-ac9cdd95ace0" path="/var/lib/kubelet/pods/be5235e6-155d-4e2d-b259-ac9cdd95ace0/volumes" Oct 01 13:49:16 crc kubenswrapper[4774]: I1001 13:49:16.168331 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:16 crc kubenswrapper[4774]: I1001 13:49:16.168997 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:16 crc kubenswrapper[4774]: I1001 13:49:16.200961 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:17 crc kubenswrapper[4774]: I1001 13:49:17.119083 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-2wbpw" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.301026 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5"] Oct 01 13:49:18 crc kubenswrapper[4774]: E1001 13:49:18.302004 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5235e6-155d-4e2d-b259-ac9cdd95ace0" containerName="registry-server" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.302029 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5235e6-155d-4e2d-b259-ac9cdd95ace0" containerName="registry-server" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.302306 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5235e6-155d-4e2d-b259-ac9cdd95ace0" containerName="registry-server" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.303666 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.307230 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cbnjv" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.316630 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5"] Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.503589 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv2wm\" (UniqueName: \"kubernetes.io/projected/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-kube-api-access-pv2wm\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.503651 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.503709 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 
13:49:18.604400 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.604520 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv2wm\" (UniqueName: \"kubernetes.io/projected/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-kube-api-access-pv2wm\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.604589 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.605097 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-bundle\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.606420 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-util\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.639526 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv2wm\" (UniqueName: \"kubernetes.io/projected/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-kube-api-access-pv2wm\") pod \"1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:18 crc kubenswrapper[4774]: I1001 13:49:18.646629 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:19 crc kubenswrapper[4774]: I1001 13:49:19.150893 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5"] Oct 01 13:49:20 crc kubenswrapper[4774]: I1001 13:49:20.099413 4774 generic.go:334] "Generic (PLEG): container finished" podID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerID="3fb2954eb40342ea996ab16157072783d1b8d5e95744ce5da669eb41001dca1d" exitCode=0 Oct 01 13:49:20 crc kubenswrapper[4774]: I1001 13:49:20.099525 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" event={"ID":"6a01fafe-bffc-4df2-93bc-43dbc2c424ff","Type":"ContainerDied","Data":"3fb2954eb40342ea996ab16157072783d1b8d5e95744ce5da669eb41001dca1d"} Oct 01 13:49:20 crc kubenswrapper[4774]: I1001 13:49:20.099908 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" event={"ID":"6a01fafe-bffc-4df2-93bc-43dbc2c424ff","Type":"ContainerStarted","Data":"662a373a51ebd5c34ff50df1b9ec0f7b3dcaa829472181e11f374df8ceb80641"} Oct 01 13:49:22 crc kubenswrapper[4774]: I1001 13:49:22.121017 4774 generic.go:334] "Generic (PLEG): container finished" podID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerID="9bf6004b73176fc9859375e3582b63906a29d29bdeeca5a528e5af5cd9cc1823" exitCode=0 Oct 01 13:49:22 crc kubenswrapper[4774]: I1001 13:49:22.121092 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" event={"ID":"6a01fafe-bffc-4df2-93bc-43dbc2c424ff","Type":"ContainerDied","Data":"9bf6004b73176fc9859375e3582b63906a29d29bdeeca5a528e5af5cd9cc1823"} Oct 01 13:49:23 crc kubenswrapper[4774]: I1001 13:49:23.132024 4774 generic.go:334] "Generic (PLEG): container finished" podID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerID="d89aeacbd88566ea806204115f19cdb447ffd00ffb76467ddf6b96b018ce8004" exitCode=0 Oct 01 13:49:23 crc kubenswrapper[4774]: I1001 13:49:23.132072 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" event={"ID":"6a01fafe-bffc-4df2-93bc-43dbc2c424ff","Type":"ContainerDied","Data":"d89aeacbd88566ea806204115f19cdb447ffd00ffb76467ddf6b96b018ce8004"} Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.472215 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.591899 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-bundle\") pod \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.591985 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-util\") pod \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.592238 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv2wm\" (UniqueName: \"kubernetes.io/projected/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-kube-api-access-pv2wm\") pod \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\" (UID: \"6a01fafe-bffc-4df2-93bc-43dbc2c424ff\") " Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.593316 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-bundle" (OuterVolumeSpecName: "bundle") pod "6a01fafe-bffc-4df2-93bc-43dbc2c424ff" (UID: "6a01fafe-bffc-4df2-93bc-43dbc2c424ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.601332 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-kube-api-access-pv2wm" (OuterVolumeSpecName: "kube-api-access-pv2wm") pod "6a01fafe-bffc-4df2-93bc-43dbc2c424ff" (UID: "6a01fafe-bffc-4df2-93bc-43dbc2c424ff"). InnerVolumeSpecName "kube-api-access-pv2wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.626238 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-util" (OuterVolumeSpecName: "util") pod "6a01fafe-bffc-4df2-93bc-43dbc2c424ff" (UID: "6a01fafe-bffc-4df2-93bc-43dbc2c424ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.694419 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv2wm\" (UniqueName: \"kubernetes.io/projected/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-kube-api-access-pv2wm\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.694542 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:24 crc kubenswrapper[4774]: I1001 13:49:24.694570 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a01fafe-bffc-4df2-93bc-43dbc2c424ff-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:25 crc kubenswrapper[4774]: I1001 13:49:25.149296 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" event={"ID":"6a01fafe-bffc-4df2-93bc-43dbc2c424ff","Type":"ContainerDied","Data":"662a373a51ebd5c34ff50df1b9ec0f7b3dcaa829472181e11f374df8ceb80641"} Oct 01 13:49:25 crc kubenswrapper[4774]: I1001 13:49:25.149351 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="662a373a51ebd5c34ff50df1b9ec0f7b3dcaa829472181e11f374df8ceb80641" Oct 01 13:49:25 crc kubenswrapper[4774]: I1001 13:49:25.149419 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.647352 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz"] Oct 01 13:49:31 crc kubenswrapper[4774]: E1001 13:49:31.648239 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="pull" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.648259 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="pull" Oct 01 13:49:31 crc kubenswrapper[4774]: E1001 13:49:31.648291 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="extract" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.648304 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="extract" Oct 01 13:49:31 crc kubenswrapper[4774]: E1001 13:49:31.648326 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="util" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.648340 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="util" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.648532 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a01fafe-bffc-4df2-93bc-43dbc2c424ff" containerName="extract" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.649442 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.654900 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.655281 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hm6d4" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.656482 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.664128 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz"] Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.708207 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9qj\" (UniqueName: \"kubernetes.io/projected/479f4868-5316-4fbf-bb7e-dd89de941340-kube-api-access-2g9qj\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.708264 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/479f4868-5316-4fbf-bb7e-dd89de941340-apiservice-cert\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.708344 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/479f4868-5316-4fbf-bb7e-dd89de941340-webhook-cert\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.809499 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9qj\" (UniqueName: \"kubernetes.io/projected/479f4868-5316-4fbf-bb7e-dd89de941340-kube-api-access-2g9qj\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.809553 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/479f4868-5316-4fbf-bb7e-dd89de941340-apiservice-cert\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.809627 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/479f4868-5316-4fbf-bb7e-dd89de941340-webhook-cert\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.819146 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/479f4868-5316-4fbf-bb7e-dd89de941340-apiservice-cert\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") 
" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.819520 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/479f4868-5316-4fbf-bb7e-dd89de941340-webhook-cert\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:31 crc kubenswrapper[4774]: I1001 13:49:31.833317 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9qj\" (UniqueName: \"kubernetes.io/projected/479f4868-5316-4fbf-bb7e-dd89de941340-kube-api-access-2g9qj\") pod \"mariadb-operator-controller-manager-566896bb75-2m2bz\" (UID: \"479f4868-5316-4fbf-bb7e-dd89de941340\") " pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:32 crc kubenswrapper[4774]: I1001 13:49:32.019470 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:32 crc kubenswrapper[4774]: I1001 13:49:32.294548 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz"] Oct 01 13:49:33 crc kubenswrapper[4774]: I1001 13:49:33.202943 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" event={"ID":"479f4868-5316-4fbf-bb7e-dd89de941340","Type":"ContainerStarted","Data":"20749c4207684f8d84b6e217713428d9058c47d6812c25e08f0b3918f78ef8f0"} Oct 01 13:49:37 crc kubenswrapper[4774]: I1001 13:49:37.232231 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" event={"ID":"479f4868-5316-4fbf-bb7e-dd89de941340","Type":"ContainerStarted","Data":"0c68fe35de3051f422edff14dc984689fa25f6ceb131d203b9a158f70752f4bd"} Oct 01 13:49:37 crc kubenswrapper[4774]: I1001 13:49:37.271215 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:49:37 crc kubenswrapper[4774]: I1001 13:49:37.271288 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:49:37 crc kubenswrapper[4774]: I1001 13:49:37.271339 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:49:37 crc kubenswrapper[4774]: 
I1001 13:49:37.271931 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5fe54fbf797d4b0be5b34d0ef2c73873c911635f7884f87ede82bc1e52a3917"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:49:37 crc kubenswrapper[4774]: I1001 13:49:37.272009 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://f5fe54fbf797d4b0be5b34d0ef2c73873c911635f7884f87ede82bc1e52a3917" gracePeriod=600 Oct 01 13:49:38 crc kubenswrapper[4774]: I1001 13:49:38.245583 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="f5fe54fbf797d4b0be5b34d0ef2c73873c911635f7884f87ede82bc1e52a3917" exitCode=0 Oct 01 13:49:38 crc kubenswrapper[4774]: I1001 13:49:38.245644 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"f5fe54fbf797d4b0be5b34d0ef2c73873c911635f7884f87ede82bc1e52a3917"} Oct 01 13:49:38 crc kubenswrapper[4774]: I1001 13:49:38.245977 4774 scope.go:117] "RemoveContainer" containerID="4f720152dbaf5a7960f0dcab2fc9653460d3dce4b218e09d42a4179e76eb1660" Oct 01 13:49:39 crc kubenswrapper[4774]: I1001 13:49:39.257401 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" event={"ID":"479f4868-5316-4fbf-bb7e-dd89de941340","Type":"ContainerStarted","Data":"c6c7e6cc56a9885aeaee3150754ea8bc474967a1f7fc00bd684fe54c88bfc4dc"} Oct 01 13:49:39 crc kubenswrapper[4774]: I1001 13:49:39.257820 4774 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:39 crc kubenswrapper[4774]: I1001 13:49:39.261478 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"47c8b3c7b9b44c1a4fad799db6db52fcadf5c6e425449337453c911dfbb7a1cd"} Oct 01 13:49:39 crc kubenswrapper[4774]: I1001 13:49:39.286064 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" podStartSLOduration=2.118058466 podStartE2EDuration="8.286032324s" podCreationTimestamp="2025-10-01 13:49:31 +0000 UTC" firstStartedPulling="2025-10-01 13:49:32.305217294 +0000 UTC m=+744.194847891" lastFinishedPulling="2025-10-01 13:49:38.473191152 +0000 UTC m=+750.362821749" observedRunningTime="2025-10-01 13:49:39.279710461 +0000 UTC m=+751.169341098" watchObservedRunningTime="2025-10-01 13:49:39.286032324 +0000 UTC m=+751.175662951" Oct 01 13:49:42 crc kubenswrapper[4774]: I1001 13:49:42.025853 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-566896bb75-2m2bz" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.243361 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t"] Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.244958 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.248028 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.273020 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t"] Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.290998 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wpmxq"] Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.291237 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" podUID="212cd75f-356e-4ed5-a82a-98617024f18c" containerName="controller-manager" containerID="cri-o://9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc" gracePeriod=30 Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.323979 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.324026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.324076 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqf5k\" (UniqueName: \"kubernetes.io/projected/2b9dc4bd-2e62-460b-b85d-f48db06a198f-kube-api-access-qqf5k\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.333049 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z"] Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.333265 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" podUID="2abdae49-e923-4ba8-92f8-376d7cde1af2" containerName="route-controller-manager" containerID="cri-o://9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871" gracePeriod=30 Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.425030 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqf5k\" (UniqueName: \"kubernetes.io/projected/2b9dc4bd-2e62-460b-b85d-f48db06a198f-kube-api-access-qqf5k\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.425801 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-util\") pod 
\"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.425996 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.426396 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.426436 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.451152 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqf5k\" (UniqueName: \"kubernetes.io/projected/2b9dc4bd-2e62-460b-b85d-f48db06a198f-kube-api-access-qqf5k\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.560205 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.641382 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.659850 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.750823 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212cd75f-356e-4ed5-a82a-98617024f18c-serving-cert\") pod \"212cd75f-356e-4ed5-a82a-98617024f18c\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.750882 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6qs6\" (UniqueName: \"kubernetes.io/projected/2abdae49-e923-4ba8-92f8-376d7cde1af2-kube-api-access-h6qs6\") pod \"2abdae49-e923-4ba8-92f8-376d7cde1af2\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.750908 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-config\") pod \"212cd75f-356e-4ed5-a82a-98617024f18c\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.750927 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-client-ca\") pod \"2abdae49-e923-4ba8-92f8-376d7cde1af2\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.750952 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-client-ca\") pod \"212cd75f-356e-4ed5-a82a-98617024f18c\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.750986 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abdae49-e923-4ba8-92f8-376d7cde1af2-serving-cert\") pod \"2abdae49-e923-4ba8-92f8-376d7cde1af2\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.751014 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-config\") pod \"2abdae49-e923-4ba8-92f8-376d7cde1af2\" (UID: \"2abdae49-e923-4ba8-92f8-376d7cde1af2\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.751032 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w59xr\" (UniqueName: \"kubernetes.io/projected/212cd75f-356e-4ed5-a82a-98617024f18c-kube-api-access-w59xr\") pod \"212cd75f-356e-4ed5-a82a-98617024f18c\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.751083 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-proxy-ca-bundles\") pod \"212cd75f-356e-4ed5-a82a-98617024f18c\" (UID: \"212cd75f-356e-4ed5-a82a-98617024f18c\") " Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.752027 4774 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "212cd75f-356e-4ed5-a82a-98617024f18c" (UID: "212cd75f-356e-4ed5-a82a-98617024f18c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.752444 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-client-ca" (OuterVolumeSpecName: "client-ca") pod "2abdae49-e923-4ba8-92f8-376d7cde1af2" (UID: "2abdae49-e923-4ba8-92f8-376d7cde1af2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.753108 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-config" (OuterVolumeSpecName: "config") pod "212cd75f-356e-4ed5-a82a-98617024f18c" (UID: "212cd75f-356e-4ed5-a82a-98617024f18c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.753158 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-config" (OuterVolumeSpecName: "config") pod "2abdae49-e923-4ba8-92f8-376d7cde1af2" (UID: "2abdae49-e923-4ba8-92f8-376d7cde1af2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.753219 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-client-ca" (OuterVolumeSpecName: "client-ca") pod "212cd75f-356e-4ed5-a82a-98617024f18c" (UID: "212cd75f-356e-4ed5-a82a-98617024f18c"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.758044 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212cd75f-356e-4ed5-a82a-98617024f18c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "212cd75f-356e-4ed5-a82a-98617024f18c" (UID: "212cd75f-356e-4ed5-a82a-98617024f18c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.758223 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/212cd75f-356e-4ed5-a82a-98617024f18c-kube-api-access-w59xr" (OuterVolumeSpecName: "kube-api-access-w59xr") pod "212cd75f-356e-4ed5-a82a-98617024f18c" (UID: "212cd75f-356e-4ed5-a82a-98617024f18c"). InnerVolumeSpecName "kube-api-access-w59xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.758288 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abdae49-e923-4ba8-92f8-376d7cde1af2-kube-api-access-h6qs6" (OuterVolumeSpecName: "kube-api-access-h6qs6") pod "2abdae49-e923-4ba8-92f8-376d7cde1af2" (UID: "2abdae49-e923-4ba8-92f8-376d7cde1af2"). InnerVolumeSpecName "kube-api-access-h6qs6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.758386 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abdae49-e923-4ba8-92f8-376d7cde1af2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2abdae49-e923-4ba8-92f8-376d7cde1af2" (UID: "2abdae49-e923-4ba8-92f8-376d7cde1af2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.833421 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t"] Oct 01 13:49:46 crc kubenswrapper[4774]: W1001 13:49:46.844009 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b9dc4bd_2e62_460b_b85d_f48db06a198f.slice/crio-9c601d22a218313159186582d3165b6e166cc0848a77678117622dec451b8cd5 WatchSource:0}: Error finding container 9c601d22a218313159186582d3165b6e166cc0848a77678117622dec451b8cd5: Status 404 returned error can't find the container with id 9c601d22a218313159186582d3165b6e166cc0848a77678117622dec451b8cd5 Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.851999 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852027 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w59xr\" (UniqueName: \"kubernetes.io/projected/212cd75f-356e-4ed5-a82a-98617024f18c-kube-api-access-w59xr\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852040 4774 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852049 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/212cd75f-356e-4ed5-a82a-98617024f18c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852059 4774 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-h6qs6\" (UniqueName: \"kubernetes.io/projected/2abdae49-e923-4ba8-92f8-376d7cde1af2-kube-api-access-h6qs6\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852068 4774 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852076 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2abdae49-e923-4ba8-92f8-376d7cde1af2-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852084 4774 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/212cd75f-356e-4ed5-a82a-98617024f18c-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:46 crc kubenswrapper[4774]: I1001 13:49:46.852092 4774 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2abdae49-e923-4ba8-92f8-376d7cde1af2-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.315868 4774 generic.go:334] "Generic (PLEG): container finished" podID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerID="131cdcc3de6acde13685812814371dc94c239fdedf388d60f6d516e76223b1c7" exitCode=0 Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.316001 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" event={"ID":"2b9dc4bd-2e62-460b-b85d-f48db06a198f","Type":"ContainerDied","Data":"131cdcc3de6acde13685812814371dc94c239fdedf388d60f6d516e76223b1c7"} Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.316266 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" event={"ID":"2b9dc4bd-2e62-460b-b85d-f48db06a198f","Type":"ContainerStarted","Data":"9c601d22a218313159186582d3165b6e166cc0848a77678117622dec451b8cd5"} Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.320989 4774 generic.go:334] "Generic (PLEG): container finished" podID="2abdae49-e923-4ba8-92f8-376d7cde1af2" containerID="9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871" exitCode=0 Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.321049 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" event={"ID":"2abdae49-e923-4ba8-92f8-376d7cde1af2","Type":"ContainerDied","Data":"9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871"} Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.321062 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.321068 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z" event={"ID":"2abdae49-e923-4ba8-92f8-376d7cde1af2","Type":"ContainerDied","Data":"0099722f7b32fafffbd79588b9686c7213fb6f07003d089c6ca329624e0cc054"} Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.321290 4774 scope.go:117] "RemoveContainer" containerID="9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.323880 4774 generic.go:334] "Generic (PLEG): container finished" podID="212cd75f-356e-4ed5-a82a-98617024f18c" containerID="9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc" exitCode=0 Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.323927 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" event={"ID":"212cd75f-356e-4ed5-a82a-98617024f18c","Type":"ContainerDied","Data":"9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc"} Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.323962 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" event={"ID":"212cd75f-356e-4ed5-a82a-98617024f18c","Type":"ContainerDied","Data":"59a7c0f64dc1c7abd139659682a2ab70dc174f6221c6fb61bfaadcd45498e77d"} Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.324041 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wpmxq" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.353202 4774 scope.go:117] "RemoveContainer" containerID="9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871" Oct 01 13:49:47 crc kubenswrapper[4774]: E1001 13:49:47.353904 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871\": container with ID starting with 9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871 not found: ID does not exist" containerID="9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.353970 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871"} err="failed to get container status \"9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871\": rpc error: code = NotFound desc = could not find container \"9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871\": container with ID starting with 9358e3b22f2a80245eb7ae8514ce88afa3a9bd7d82792eb9a99d11562e2f3871 not found: ID does 
not exist" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.354013 4774 scope.go:117] "RemoveContainer" containerID="9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.366109 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wpmxq"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.375735 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wpmxq"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.377904 4774 scope.go:117] "RemoveContainer" containerID="9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc" Oct 01 13:49:47 crc kubenswrapper[4774]: E1001 13:49:47.378955 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc\": container with ID starting with 9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc not found: ID does not exist" containerID="9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.379012 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc"} err="failed to get container status \"9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc\": rpc error: code = NotFound desc = could not find container \"9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc\": container with ID starting with 9411f35526580184dfb674f104bc15bcb7bc763fa06debeff9a9c81c55faf7fc not found: ID does not exist" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.381539 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.389262 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ntw6z"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.817114 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-649bf84764-tr55v"] Oct 01 13:49:47 crc kubenswrapper[4774]: E1001 13:49:47.817951 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abdae49-e923-4ba8-92f8-376d7cde1af2" containerName="route-controller-manager" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.817977 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abdae49-e923-4ba8-92f8-376d7cde1af2" containerName="route-controller-manager" Oct 01 13:49:47 crc kubenswrapper[4774]: E1001 13:49:47.818010 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212cd75f-356e-4ed5-a82a-98617024f18c" containerName="controller-manager" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.818023 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="212cd75f-356e-4ed5-a82a-98617024f18c" containerName="controller-manager" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.818184 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abdae49-e923-4ba8-92f8-376d7cde1af2" containerName="route-controller-manager" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.818225 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="212cd75f-356e-4ed5-a82a-98617024f18c" containerName="controller-manager" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.818799 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.822602 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.822791 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.823104 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.823931 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.824085 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.824254 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.832406 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.833497 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.842580 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.843034 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.843160 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.843184 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.843125 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.846336 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.847346 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-649bf84764-tr55v"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.850121 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.853697 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn"] Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.867678 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-serving-cert\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.867779 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-config\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.867893 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-client-ca\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.867983 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-proxy-ca-bundles\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.868115 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwgf9\" (UniqueName: \"kubernetes.io/projected/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-kube-api-access-xwgf9\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " 
pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969201 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz2nz\" (UniqueName: \"kubernetes.io/projected/a1bc5ca1-25ca-4611-8f63-35a776b093cf-kube-api-access-mz2nz\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969255 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-client-ca\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969273 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-proxy-ca-bundles\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969324 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1bc5ca1-25ca-4611-8f63-35a776b093cf-serving-cert\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969360 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xwgf9\" (UniqueName: \"kubernetes.io/projected/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-kube-api-access-xwgf9\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969383 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc5ca1-25ca-4611-8f63-35a776b093cf-config\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969401 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1bc5ca1-25ca-4611-8f63-35a776b093cf-client-ca\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969426 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-serving-cert\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.969446 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-config\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " 
pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.970446 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-client-ca\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.970936 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-proxy-ca-bundles\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.971088 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-config\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.986368 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-serving-cert\") pod \"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:47 crc kubenswrapper[4774]: I1001 13:49:47.997965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwgf9\" (UniqueName: \"kubernetes.io/projected/81c5e12b-c6ae-4d12-b447-0f84dafd11b3-kube-api-access-xwgf9\") pod 
\"controller-manager-649bf84764-tr55v\" (UID: \"81c5e12b-c6ae-4d12-b447-0f84dafd11b3\") " pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.070493 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz2nz\" (UniqueName: \"kubernetes.io/projected/a1bc5ca1-25ca-4611-8f63-35a776b093cf-kube-api-access-mz2nz\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.070563 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1bc5ca1-25ca-4611-8f63-35a776b093cf-serving-cert\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.070603 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc5ca1-25ca-4611-8f63-35a776b093cf-config\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.070636 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1bc5ca1-25ca-4611-8f63-35a776b093cf-client-ca\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.071794 4774 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a1bc5ca1-25ca-4611-8f63-35a776b093cf-client-ca\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.072132 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bc5ca1-25ca-4611-8f63-35a776b093cf-config\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.076018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1bc5ca1-25ca-4611-8f63-35a776b093cf-serving-cert\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.090371 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz2nz\" (UniqueName: \"kubernetes.io/projected/a1bc5ca1-25ca-4611-8f63-35a776b093cf-kube-api-access-mz2nz\") pod \"route-controller-manager-79596b6886-phfsn\" (UID: \"a1bc5ca1-25ca-4611-8f63-35a776b093cf\") " pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.139681 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.176162 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.511917 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn"] Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.608149 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-649bf84764-tr55v"] Oct 01 13:49:48 crc kubenswrapper[4774]: W1001 13:49:48.625991 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c5e12b_c6ae_4d12_b447_0f84dafd11b3.slice/crio-79171134bea42d3fc9672e2c2005947684c779c51849c59990f1d38ae766462e WatchSource:0}: Error finding container 79171134bea42d3fc9672e2c2005947684c779c51849c59990f1d38ae766462e: Status 404 returned error can't find the container with id 79171134bea42d3fc9672e2c2005947684c779c51849c59990f1d38ae766462e Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.880073 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="212cd75f-356e-4ed5-a82a-98617024f18c" path="/var/lib/kubelet/pods/212cd75f-356e-4ed5-a82a-98617024f18c/volumes" Oct 01 13:49:48 crc kubenswrapper[4774]: I1001 13:49:48.881130 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abdae49-e923-4ba8-92f8-376d7cde1af2" path="/var/lib/kubelet/pods/2abdae49-e923-4ba8-92f8-376d7cde1af2/volumes" Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.341630 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" event={"ID":"a1bc5ca1-25ca-4611-8f63-35a776b093cf","Type":"ContainerStarted","Data":"d8054571bd0f3582856972195bdbb6a74de9c059df84ad402f4855c6811d434c"} Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.341678 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" event={"ID":"a1bc5ca1-25ca-4611-8f63-35a776b093cf","Type":"ContainerStarted","Data":"989670384abe733a0ed488713617d82ad5b57154748fc0e1774dce89162436b0"} Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.342180 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.342763 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" event={"ID":"81c5e12b-c6ae-4d12-b447-0f84dafd11b3","Type":"ContainerStarted","Data":"70032a46811df7c408d6bebe734ff87f8047ae4699ba0a00bbc9835f201317ae"} Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.342795 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" event={"ID":"81c5e12b-c6ae-4d12-b447-0f84dafd11b3","Type":"ContainerStarted","Data":"79171134bea42d3fc9672e2c2005947684c779c51849c59990f1d38ae766462e"} Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.349141 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" Oct 01 13:49:49 crc kubenswrapper[4774]: I1001 13:49:49.360784 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79596b6886-phfsn" podStartSLOduration=2.360761647 podStartE2EDuration="2.360761647s" podCreationTimestamp="2025-10-01 13:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:49:49.357244156 +0000 UTC m=+761.246874773" watchObservedRunningTime="2025-10-01 13:49:49.360761647 +0000 UTC m=+761.250392254" Oct 01 13:49:49 crc 
kubenswrapper[4774]: I1001 13:49:49.402028 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" podStartSLOduration=2.402012678 podStartE2EDuration="2.402012678s" podCreationTimestamp="2025-10-01 13:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:49:49.399108123 +0000 UTC m=+761.288738730" watchObservedRunningTime="2025-10-01 13:49:49.402012678 +0000 UTC m=+761.291643275" Oct 01 13:49:50 crc kubenswrapper[4774]: I1001 13:49:50.351108 4774 generic.go:334] "Generic (PLEG): container finished" podID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerID="6b0f0c120ae5012240f4cb92039adccbdf6461d68c7c96c471896e8b31a8eb0f" exitCode=0 Oct 01 13:49:50 crc kubenswrapper[4774]: I1001 13:49:50.351179 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" event={"ID":"2b9dc4bd-2e62-460b-b85d-f48db06a198f","Type":"ContainerDied","Data":"6b0f0c120ae5012240f4cb92039adccbdf6461d68c7c96c471896e8b31a8eb0f"} Oct 01 13:49:50 crc kubenswrapper[4774]: I1001 13:49:50.351970 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:50 crc kubenswrapper[4774]: I1001 13:49:50.357016 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-649bf84764-tr55v" Oct 01 13:49:51 crc kubenswrapper[4774]: I1001 13:49:51.362749 4774 generic.go:334] "Generic (PLEG): container finished" podID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerID="ab3af758d4c14487039cec6f7a80e0c94d3703a17c4f6b31802ca3f2c93e0c77" exitCode=0 Oct 01 13:49:51 crc kubenswrapper[4774]: I1001 13:49:51.362820 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" event={"ID":"2b9dc4bd-2e62-460b-b85d-f48db06a198f","Type":"ContainerDied","Data":"ab3af758d4c14487039cec6f7a80e0c94d3703a17c4f6b31802ca3f2c93e0c77"} Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.734182 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.832606 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-bundle\") pod \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.832717 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-util\") pod \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.832802 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqf5k\" (UniqueName: \"kubernetes.io/projected/2b9dc4bd-2e62-460b-b85d-f48db06a198f-kube-api-access-qqf5k\") pod \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\" (UID: \"2b9dc4bd-2e62-460b-b85d-f48db06a198f\") " Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.834334 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-bundle" (OuterVolumeSpecName: "bundle") pod "2b9dc4bd-2e62-460b-b85d-f48db06a198f" (UID: "2b9dc4bd-2e62-460b-b85d-f48db06a198f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.842149 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9dc4bd-2e62-460b-b85d-f48db06a198f-kube-api-access-qqf5k" (OuterVolumeSpecName: "kube-api-access-qqf5k") pod "2b9dc4bd-2e62-460b-b85d-f48db06a198f" (UID: "2b9dc4bd-2e62-460b-b85d-f48db06a198f"). InnerVolumeSpecName "kube-api-access-qqf5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.855239 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-util" (OuterVolumeSpecName: "util") pod "2b9dc4bd-2e62-460b-b85d-f48db06a198f" (UID: "2b9dc4bd-2e62-460b-b85d-f48db06a198f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.937424 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.937511 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b9dc4bd-2e62-460b-b85d-f48db06a198f-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:52 crc kubenswrapper[4774]: I1001 13:49:52.937530 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqf5k\" (UniqueName: \"kubernetes.io/projected/2b9dc4bd-2e62-460b-b85d-f48db06a198f-kube-api-access-qqf5k\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:53 crc kubenswrapper[4774]: I1001 13:49:53.378560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t" 
event={"ID":"2b9dc4bd-2e62-460b-b85d-f48db06a198f","Type":"ContainerDied","Data":"9c601d22a218313159186582d3165b6e166cc0848a77678117622dec451b8cd5"}
Oct 01 13:49:53 crc kubenswrapper[4774]: I1001 13:49:53.378628 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c601d22a218313159186582d3165b6e166cc0848a77678117622dec451b8cd5"
Oct 01 13:49:53 crc kubenswrapper[4774]: I1001 13:49:53.378803 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t"
Oct 01 13:49:56 crc kubenswrapper[4774]: I1001 13:49:56.039133 4774 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.606116 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"]
Oct 01 13:50:01 crc kubenswrapper[4774]: E1001 13:50:01.606773 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="extract"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.606785 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="extract"
Oct 01 13:50:01 crc kubenswrapper[4774]: E1001 13:50:01.606797 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="util"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.606803 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="util"
Oct 01 13:50:01 crc kubenswrapper[4774]: E1001 13:50:01.606814 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="pull"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.606821 
4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="pull" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.606911 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9dc4bd-2e62-460b-b85d-f48db06a198f" containerName="extract" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.607229 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.609581 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.609792 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.610072 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dpmgz" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.610226 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.610223 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.670027 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"] Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.748527 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa824545-2e4e-49f3-ab37-05c10785acee-webhook-cert\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: 
\"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.748576 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa824545-2e4e-49f3-ab37-05c10785acee-apiservice-cert\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.748643 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7fpj\" (UniqueName: \"kubernetes.io/projected/aa824545-2e4e-49f3-ab37-05c10785acee-kube-api-access-q7fpj\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.849660 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa824545-2e4e-49f3-ab37-05c10785acee-webhook-cert\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.849713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa824545-2e4e-49f3-ab37-05c10785acee-apiservice-cert\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.849746 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7fpj\" (UniqueName: \"kubernetes.io/projected/aa824545-2e4e-49f3-ab37-05c10785acee-kube-api-access-q7fpj\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.855016 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aa824545-2e4e-49f3-ab37-05c10785acee-apiservice-cert\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.855418 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aa824545-2e4e-49f3-ab37-05c10785acee-webhook-cert\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.870293 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7fpj\" (UniqueName: \"kubernetes.io/projected/aa824545-2e4e-49f3-ab37-05c10785acee-kube-api-access-q7fpj\") pod \"metallb-operator-controller-manager-597c4b7b96-2jb5j\" (UID: \"aa824545-2e4e-49f3-ab37-05c10785acee\") " pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.923278 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"]
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.923290 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.923908 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.926047 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.926341 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pkxrs"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.936898 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Oct 01 13:50:01 crc kubenswrapper[4774]: I1001 13:50:01.952161 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"]
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.053289 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9d7f73a-7289-4825-9659-d330a9496ae1-apiservice-cert\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.053363 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfhc\" (UniqueName: \"kubernetes.io/projected/b9d7f73a-7289-4825-9659-d330a9496ae1-kube-api-access-whfhc\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.053418 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9d7f73a-7289-4825-9659-d330a9496ae1-webhook-cert\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.154443 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9d7f73a-7289-4825-9659-d330a9496ae1-webhook-cert\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.154510 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9d7f73a-7289-4825-9659-d330a9496ae1-apiservice-cert\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.154554 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfhc\" (UniqueName: \"kubernetes.io/projected/b9d7f73a-7289-4825-9659-d330a9496ae1-kube-api-access-whfhc\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.157194 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xtx9d"]
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.158368 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.159792 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9d7f73a-7289-4825-9659-d330a9496ae1-apiservice-cert\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.179852 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xtx9d"]
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.180997 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfhc\" (UniqueName: \"kubernetes.io/projected/b9d7f73a-7289-4825-9659-d330a9496ae1-kube-api-access-whfhc\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.184140 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9d7f73a-7289-4825-9659-d330a9496ae1-webhook-cert\") pod \"metallb-operator-webhook-server-7b9b85bd76-p6smt\" (UID: \"b9d7f73a-7289-4825-9659-d330a9496ae1\") " pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.236914 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.256119 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-catalog-content\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.256266 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hsm4\" (UniqueName: \"kubernetes.io/projected/cee02ff9-a48f-4e0d-b233-517378c02151-kube-api-access-2hsm4\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.256342 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-utilities\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.358816 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-catalog-content\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.358905 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hsm4\" (UniqueName: \"kubernetes.io/projected/cee02ff9-a48f-4e0d-b233-517378c02151-kube-api-access-2hsm4\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.358947 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-utilities\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.359682 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-utilities\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.360131 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-catalog-content\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.374938 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"]
Oct 01 13:50:02 crc kubenswrapper[4774]: W1001 13:50:02.380367 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa824545_2e4e_49f3_ab37_05c10785acee.slice/crio-dcd9f1df18d7d9abdd2d19647a93e55a9f9ed3710fc473bbf3e8c3546a74d9ce WatchSource:0}: Error finding container dcd9f1df18d7d9abdd2d19647a93e55a9f9ed3710fc473bbf3e8c3546a74d9ce: Status 404 returned error can't find the container with id dcd9f1df18d7d9abdd2d19647a93e55a9f9ed3710fc473bbf3e8c3546a74d9ce
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.389738 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hsm4\" (UniqueName: \"kubernetes.io/projected/cee02ff9-a48f-4e0d-b233-517378c02151-kube-api-access-2hsm4\") pod \"community-operators-xtx9d\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") " pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.439742 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" event={"ID":"aa824545-2e4e-49f3-ab37-05c10785acee","Type":"ContainerStarted","Data":"dcd9f1df18d7d9abdd2d19647a93e55a9f9ed3710fc473bbf3e8c3546a74d9ce"}
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.493644 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.658379 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"]
Oct 01 13:50:02 crc kubenswrapper[4774]: W1001 13:50:02.666251 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d7f73a_7289_4825_9659_d330a9496ae1.slice/crio-954cf52f7ea0da706bd1a46fe0eca24f1d132fa0b0e54d5b90e5f9715b193c7e WatchSource:0}: Error finding container 954cf52f7ea0da706bd1a46fe0eca24f1d132fa0b0e54d5b90e5f9715b193c7e: Status 404 returned error can't find the container with id 954cf52f7ea0da706bd1a46fe0eca24f1d132fa0b0e54d5b90e5f9715b193c7e
Oct 01 13:50:02 crc kubenswrapper[4774]: I1001 13:50:02.866987 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xtx9d"]
Oct 01 13:50:02 crc kubenswrapper[4774]: W1001 13:50:02.882365 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee02ff9_a48f_4e0d_b233_517378c02151.slice/crio-de128d9db29822af25b103a0f602e664a4a541210f25d3bb4804f46bbc5c4745 WatchSource:0}: Error finding container de128d9db29822af25b103a0f602e664a4a541210f25d3bb4804f46bbc5c4745: Status 404 returned error can't find the container with id de128d9db29822af25b103a0f602e664a4a541210f25d3bb4804f46bbc5c4745
Oct 01 13:50:03 crc kubenswrapper[4774]: I1001 13:50:03.446918 4774 generic.go:334] "Generic (PLEG): container finished" podID="cee02ff9-a48f-4e0d-b233-517378c02151" containerID="38a7d0828b74d3eca7cd89f2b9e85989960f857028021fe374ebaeb10dfca5a2" exitCode=0
Oct 01 13:50:03 crc kubenswrapper[4774]: I1001 13:50:03.446963 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xtx9d" event={"ID":"cee02ff9-a48f-4e0d-b233-517378c02151","Type":"ContainerDied","Data":"38a7d0828b74d3eca7cd89f2b9e85989960f857028021fe374ebaeb10dfca5a2"}
Oct 01 13:50:03 crc kubenswrapper[4774]: I1001 13:50:03.447002 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xtx9d" event={"ID":"cee02ff9-a48f-4e0d-b233-517378c02151","Type":"ContainerStarted","Data":"de128d9db29822af25b103a0f602e664a4a541210f25d3bb4804f46bbc5c4745"}
Oct 01 13:50:03 crc kubenswrapper[4774]: I1001 13:50:03.448310 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt" event={"ID":"b9d7f73a-7289-4825-9659-d330a9496ae1","Type":"ContainerStarted","Data":"954cf52f7ea0da706bd1a46fe0eca24f1d132fa0b0e54d5b90e5f9715b193c7e"}
Oct 01 13:50:05 crc kubenswrapper[4774]: I1001 13:50:05.470943 4774 generic.go:334] "Generic (PLEG): container finished" podID="cee02ff9-a48f-4e0d-b233-517378c02151" containerID="25dd750abc7a66c482a6c3bb87ae8cd2c03c62a510ebd6040caf63216c815f00" exitCode=0
Oct 01 13:50:05 crc kubenswrapper[4774]: I1001 13:50:05.471603 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xtx9d" event={"ID":"cee02ff9-a48f-4e0d-b233-517378c02151","Type":"ContainerDied","Data":"25dd750abc7a66c482a6c3bb87ae8cd2c03c62a510ebd6040caf63216c815f00"}
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.568229 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gn958"]
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.569652 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.576968 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn958"]
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.626706 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-catalog-content\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.626759 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7h7x\" (UniqueName: \"kubernetes.io/projected/09239625-284f-49ea-82c3-ba25e22cf6a0-kube-api-access-w7h7x\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.626804 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-utilities\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.728087 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-catalog-content\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.728137 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7h7x\" (UniqueName: \"kubernetes.io/projected/09239625-284f-49ea-82c3-ba25e22cf6a0-kube-api-access-w7h7x\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.728168 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-utilities\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.728705 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-utilities\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.728811 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-catalog-content\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.759239 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7h7x\" (UniqueName: \"kubernetes.io/projected/09239625-284f-49ea-82c3-ba25e22cf6a0-kube-api-access-w7h7x\") pod \"redhat-marketplace-gn958\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") " pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:07 crc kubenswrapper[4774]: I1001 13:50:07.906243 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.407435 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn958"]
Oct 01 13:50:08 crc kubenswrapper[4774]: W1001 13:50:08.415010 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09239625_284f_49ea_82c3_ba25e22cf6a0.slice/crio-a9d3a6efc6fa256d21f39b51528eac768032c6897f18c0bc4542f21bbe5f73e7 WatchSource:0}: Error finding container a9d3a6efc6fa256d21f39b51528eac768032c6897f18c0bc4542f21bbe5f73e7: Status 404 returned error can't find the container with id a9d3a6efc6fa256d21f39b51528eac768032c6897f18c0bc4542f21bbe5f73e7
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.493711 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt" event={"ID":"b9d7f73a-7289-4825-9659-d330a9496ae1","Type":"ContainerStarted","Data":"82494ae324673bf9044e1fe02c510f812eedfe414c43451c4eb6c96d165930d8"}
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.493877 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.495172 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" event={"ID":"aa824545-2e4e-49f3-ab37-05c10785acee","Type":"ContainerStarted","Data":"819c9a587da6db1f37ebf1ae868202ae37337389b821db5083c7236dae6f3105"}
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.495317 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j"
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.496693 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xtx9d" event={"ID":"cee02ff9-a48f-4e0d-b233-517378c02151","Type":"ContainerStarted","Data":"1575e583c35319a202ac047fc06d06544e09235ee4f40feaaad1c499d7da7c46"}
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.498259 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerStarted","Data":"a9d3a6efc6fa256d21f39b51528eac768032c6897f18c0bc4542f21bbe5f73e7"}
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.518896 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt" podStartSLOduration=2.591976383 podStartE2EDuration="7.518881383s" podCreationTimestamp="2025-10-01 13:50:01 +0000 UTC" firstStartedPulling="2025-10-01 13:50:02.669751347 +0000 UTC m=+774.559381944" lastFinishedPulling="2025-10-01 13:50:07.596656347 +0000 UTC m=+779.486286944" observedRunningTime="2025-10-01 13:50:08.516945063 +0000 UTC m=+780.406575680" watchObservedRunningTime="2025-10-01 13:50:08.518881383 +0000 UTC m=+780.408511980"
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.568742 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" podStartSLOduration=2.50169197 podStartE2EDuration="7.568720775s" podCreationTimestamp="2025-10-01 13:50:01 +0000 UTC" firstStartedPulling="2025-10-01 13:50:02.388023789 +0000 UTC m=+774.277654386" lastFinishedPulling="2025-10-01 13:50:07.455052584 +0000 UTC m=+779.344683191" observedRunningTime="2025-10-01 13:50:08.565134003 +0000 UTC m=+780.454764650" watchObservedRunningTime="2025-10-01 13:50:08.568720775 +0000 UTC m=+780.458351372"
Oct 01 13:50:08 crc kubenswrapper[4774]: I1001 13:50:08.587860 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xtx9d" podStartSLOduration=2.486570358 podStartE2EDuration="6.587842477s" podCreationTimestamp="2025-10-01 13:50:02 +0000 UTC" firstStartedPulling="2025-10-01 13:50:03.449143808 +0000 UTC m=+775.338774405" lastFinishedPulling="2025-10-01 13:50:07.550415927 +0000 UTC m=+779.440046524" observedRunningTime="2025-10-01 13:50:08.585762423 +0000 UTC m=+780.475393020" watchObservedRunningTime="2025-10-01 13:50:08.587842477 +0000 UTC m=+780.477473074"
Oct 01 13:50:09 crc kubenswrapper[4774]: I1001 13:50:09.504932 4774 generic.go:334] "Generic (PLEG): container finished" podID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerID="24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70" exitCode=0
Oct 01 13:50:09 crc kubenswrapper[4774]: I1001 13:50:09.505043 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerDied","Data":"24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70"}
Oct 01 13:50:10 crc kubenswrapper[4774]: I1001 13:50:10.511973 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerStarted","Data":"cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198"}
Oct 01 13:50:11 crc kubenswrapper[4774]: I1001 13:50:11.523383 4774 generic.go:334] "Generic (PLEG): container finished" podID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerID="cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198" exitCode=0
Oct 01 13:50:11 crc kubenswrapper[4774]: I1001 13:50:11.523507 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerDied","Data":"cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198"}
Oct 01 13:50:12 crc kubenswrapper[4774]: I1001 13:50:12.494655 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:12 crc kubenswrapper[4774]: I1001 13:50:12.495511 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:12 crc kubenswrapper[4774]: I1001 13:50:12.531785 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerStarted","Data":"0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4"}
Oct 01 13:50:12 crc kubenswrapper[4774]: I1001 13:50:12.553324 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:12 crc kubenswrapper[4774]: I1001 13:50:12.561441 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gn958" podStartSLOduration=2.844542827 podStartE2EDuration="5.561416212s" podCreationTimestamp="2025-10-01 13:50:07 +0000 UTC" firstStartedPulling="2025-10-01 13:50:09.506970023 +0000 UTC m=+781.396600630" lastFinishedPulling="2025-10-01 13:50:12.223843418 +0000 UTC m=+784.113474015" observedRunningTime="2025-10-01 13:50:12.560378805 +0000 UTC m=+784.450009412" watchObservedRunningTime="2025-10-01 13:50:12.561416212 +0000 UTC m=+784.451046829"
Oct 01 13:50:13 crc kubenswrapper[4774]: I1001 13:50:13.613335 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:15 crc kubenswrapper[4774]: I1001 13:50:15.952543 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xtx9d"]
Oct 01 13:50:15 crc kubenswrapper[4774]: I1001 13:50:15.953403 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xtx9d" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="registry-server" containerID="cri-o://1575e583c35319a202ac047fc06d06544e09235ee4f40feaaad1c499d7da7c46" gracePeriod=2
Oct 01 13:50:16 crc kubenswrapper[4774]: I1001 13:50:16.567313 4774 generic.go:334] "Generic (PLEG): container finished" podID="cee02ff9-a48f-4e0d-b233-517378c02151" containerID="1575e583c35319a202ac047fc06d06544e09235ee4f40feaaad1c499d7da7c46" exitCode=0
Oct 01 13:50:16 crc kubenswrapper[4774]: I1001 13:50:16.567387 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xtx9d" event={"ID":"cee02ff9-a48f-4e0d-b233-517378c02151","Type":"ContainerDied","Data":"1575e583c35319a202ac047fc06d06544e09235ee4f40feaaad1c499d7da7c46"}
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.002076 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.051701 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-catalog-content\") pod \"cee02ff9-a48f-4e0d-b233-517378c02151\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") "
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.051804 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-utilities\") pod \"cee02ff9-a48f-4e0d-b233-517378c02151\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") "
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.051866 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hsm4\" (UniqueName: \"kubernetes.io/projected/cee02ff9-a48f-4e0d-b233-517378c02151-kube-api-access-2hsm4\") pod \"cee02ff9-a48f-4e0d-b233-517378c02151\" (UID: \"cee02ff9-a48f-4e0d-b233-517378c02151\") "
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.052552 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-utilities" (OuterVolumeSpecName: "utilities") pod "cee02ff9-a48f-4e0d-b233-517378c02151" (UID: "cee02ff9-a48f-4e0d-b233-517378c02151"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.082685 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee02ff9-a48f-4e0d-b233-517378c02151-kube-api-access-2hsm4" (OuterVolumeSpecName: "kube-api-access-2hsm4") pod "cee02ff9-a48f-4e0d-b233-517378c02151" (UID: "cee02ff9-a48f-4e0d-b233-517378c02151"). InnerVolumeSpecName "kube-api-access-2hsm4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.105930 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cee02ff9-a48f-4e0d-b233-517378c02151" (UID: "cee02ff9-a48f-4e0d-b233-517378c02151"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.152578 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hsm4\" (UniqueName: \"kubernetes.io/projected/cee02ff9-a48f-4e0d-b233-517378c02151-kube-api-access-2hsm4\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.152611 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.152620 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee02ff9-a48f-4e0d-b233-517378c02151-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.577316 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xtx9d" event={"ID":"cee02ff9-a48f-4e0d-b233-517378c02151","Type":"ContainerDied","Data":"de128d9db29822af25b103a0f602e664a4a541210f25d3bb4804f46bbc5c4745"}
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.577387 4774 scope.go:117] "RemoveContainer" containerID="1575e583c35319a202ac047fc06d06544e09235ee4f40feaaad1c499d7da7c46"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.577629 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xtx9d"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.614131 4774 scope.go:117] "RemoveContainer" containerID="25dd750abc7a66c482a6c3bb87ae8cd2c03c62a510ebd6040caf63216c815f00"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.648587 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xtx9d"]
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.661414 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xtx9d"]
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.665890 4774 scope.go:117] "RemoveContainer" containerID="38a7d0828b74d3eca7cd89f2b9e85989960f857028021fe374ebaeb10dfca5a2"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.906493 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.906808 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:17 crc kubenswrapper[4774]: I1001 13:50:17.978542 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:18 crc kubenswrapper[4774]: I1001 13:50:18.628332 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:18 crc kubenswrapper[4774]: I1001 13:50:18.879667 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" path="/var/lib/kubelet/pods/cee02ff9-a48f-4e0d-b233-517378c02151/volumes"
Oct 01 13:50:21 crc kubenswrapper[4774]: I1001 13:50:21.753381 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn958"]
Oct 01 13:50:21 crc kubenswrapper[4774]: I1001 13:50:21.754291 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gn958" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="registry-server" containerID="cri-o://0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4" gracePeriod=2
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.240380 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7b9b85bd76-p6smt"
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.285620 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn958"
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.320484 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-catalog-content\") pod \"09239625-284f-49ea-82c3-ba25e22cf6a0\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") "
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.320903 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-utilities\") pod \"09239625-284f-49ea-82c3-ba25e22cf6a0\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") "
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.320995 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7h7x\" (UniqueName: \"kubernetes.io/projected/09239625-284f-49ea-82c3-ba25e22cf6a0-kube-api-access-w7h7x\") pod \"09239625-284f-49ea-82c3-ba25e22cf6a0\" (UID: \"09239625-284f-49ea-82c3-ba25e22cf6a0\") "
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.321495 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-utilities" (OuterVolumeSpecName: "utilities") pod "09239625-284f-49ea-82c3-ba25e22cf6a0" (UID: "09239625-284f-49ea-82c3-ba25e22cf6a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.325738 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09239625-284f-49ea-82c3-ba25e22cf6a0-kube-api-access-w7h7x" (OuterVolumeSpecName: "kube-api-access-w7h7x") pod "09239625-284f-49ea-82c3-ba25e22cf6a0" (UID: "09239625-284f-49ea-82c3-ba25e22cf6a0"). InnerVolumeSpecName "kube-api-access-w7h7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.333070 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09239625-284f-49ea-82c3-ba25e22cf6a0" (UID: "09239625-284f-49ea-82c3-ba25e22cf6a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.422724 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7h7x\" (UniqueName: \"kubernetes.io/projected/09239625-284f-49ea-82c3-ba25e22cf6a0-kube-api-access-w7h7x\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.422758 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.422768 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09239625-284f-49ea-82c3-ba25e22cf6a0-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.609811 4774 generic.go:334] "Generic (PLEG): container finished" podID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerID="0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4" exitCode=0
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.609875 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerDied","Data":"0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4"}
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.610292 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gn958" event={"ID":"09239625-284f-49ea-82c3-ba25e22cf6a0","Type":"ContainerDied","Data":"a9d3a6efc6fa256d21f39b51528eac768032c6897f18c0bc4542f21bbe5f73e7"}
Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.609938 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gn958" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.610314 4774 scope.go:117] "RemoveContainer" containerID="0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.630908 4774 scope.go:117] "RemoveContainer" containerID="cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.655499 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn958"] Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.659308 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gn958"] Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.675543 4774 scope.go:117] "RemoveContainer" containerID="24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.690769 4774 scope.go:117] "RemoveContainer" containerID="0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4" Oct 01 13:50:22 crc kubenswrapper[4774]: E1001 13:50:22.691402 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4\": container with ID starting with 0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4 not found: ID does not exist" containerID="0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.691468 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4"} err="failed to get container status \"0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4\": rpc error: code = NotFound desc = could not find container 
\"0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4\": container with ID starting with 0c20a8dab13425935c11100eabccc580f672a66e352ed2ebe98dc674aa549ab4 not found: ID does not exist" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.691499 4774 scope.go:117] "RemoveContainer" containerID="cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198" Oct 01 13:50:22 crc kubenswrapper[4774]: E1001 13:50:22.691956 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198\": container with ID starting with cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198 not found: ID does not exist" containerID="cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.692010 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198"} err="failed to get container status \"cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198\": rpc error: code = NotFound desc = could not find container \"cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198\": container with ID starting with cad69e3ae789cb032409d3a3fe574cb9a63543fd2fdccd7a7fe6f657081f5198 not found: ID does not exist" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.692046 4774 scope.go:117] "RemoveContainer" containerID="24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70" Oct 01 13:50:22 crc kubenswrapper[4774]: E1001 13:50:22.692377 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70\": container with ID starting with 24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70 not found: ID does not exist" 
containerID="24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.692404 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70"} err="failed to get container status \"24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70\": rpc error: code = NotFound desc = could not find container \"24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70\": container with ID starting with 24967e8d1596409052f75c6b2e8d0393c32af2a0743bdb436937209ee3937a70 not found: ID does not exist" Oct 01 13:50:22 crc kubenswrapper[4774]: I1001 13:50:22.877737 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" path="/var/lib/kubelet/pods/09239625-284f-49ea-82c3-ba25e22cf6a0/volumes" Oct 01 13:50:41 crc kubenswrapper[4774]: I1001 13:50:41.927566 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-597c4b7b96-2jb5j" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781295 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-59tlr"] Oct 01 13:50:42 crc kubenswrapper[4774]: E1001 13:50:42.781653 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="extract-utilities" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781683 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="extract-utilities" Oct 01 13:50:42 crc kubenswrapper[4774]: E1001 13:50:42.781710 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="extract-utilities" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781721 4774 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="extract-utilities" Oct 01 13:50:42 crc kubenswrapper[4774]: E1001 13:50:42.781743 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="registry-server" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781756 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="registry-server" Oct 01 13:50:42 crc kubenswrapper[4774]: E1001 13:50:42.781771 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="registry-server" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781782 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="registry-server" Oct 01 13:50:42 crc kubenswrapper[4774]: E1001 13:50:42.781804 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="extract-content" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781815 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="extract-content" Oct 01 13:50:42 crc kubenswrapper[4774]: E1001 13:50:42.781833 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="extract-content" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.781870 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="extract-content" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.782044 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee02ff9-a48f-4e0d-b233-517378c02151" containerName="registry-server" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.782061 4774 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="09239625-284f-49ea-82c3-ba25e22cf6a0" containerName="registry-server" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.788734 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.792660 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xr9wv" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.796372 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.801613 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.818244 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-8g948"] Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.818941 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.833368 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.846526 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-8g948"] Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-reloader\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900118 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-sockets\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900140 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-conf\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900163 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics-certs\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 
13:50:42.900190 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900221 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpv9\" (UniqueName: \"kubernetes.io/projected/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-kube-api-access-jnpv9\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900248 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-cert\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900270 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-startup\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.900314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprgg\" (UniqueName: \"kubernetes.io/projected/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-kube-api-access-lprgg\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:42 crc kubenswrapper[4774]: 
I1001 13:50:42.929218 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2nzm9"] Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.930088 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2nzm9" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.931701 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.931914 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-hvzfh" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.931932 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.932512 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.940596 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-rhh9b"] Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.941429 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.945725 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 13:50:42 crc kubenswrapper[4774]: I1001 13:50:42.957988 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-rhh9b"] Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.000903 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.000941 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-metrics-certs\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.000963 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lprgg\" (UniqueName: \"kubernetes.io/projected/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-kube-api-access-lprgg\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.000983 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-metrics-certs\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 
13:50:43.001012 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4db8s\" (UniqueName: \"kubernetes.io/projected/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-kube-api-access-4db8s\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001027 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-cert\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001043 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-metallb-excludel2\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001065 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-reloader\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001082 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-sockets\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001095 4774 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-conf\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001109 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics-certs\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001129 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001148 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpv9\" (UniqueName: \"kubernetes.io/projected/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-kube-api-access-jnpv9\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001167 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-cert\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001183 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-startup\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001199 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kzwd\" (UniqueName: \"kubernetes.io/projected/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-kube-api-access-4kzwd\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.001818 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-reloader\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.002018 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-sockets\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.002190 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-conf\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.002251 4774 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.002293 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics-certs 
podName:e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e nodeName:}" failed. No retries permitted until 2025-10-01 13:50:43.50227775 +0000 UTC m=+815.391908347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics-certs") pod "frr-k8s-59tlr" (UID: "e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e") : secret "frr-k8s-certs-secret" not found Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.002599 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.002690 4774 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.002785 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-cert podName:bf28ca5c-6c07-4447-bfde-7c86bd08f4ae nodeName:}" failed. No retries permitted until 2025-10-01 13:50:43.502758603 +0000 UTC m=+815.392389290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-cert") pod "frr-k8s-webhook-server-5478bdb765-8g948" (UID: "bf28ca5c-6c07-4447-bfde-7c86bd08f4ae") : secret "frr-k8s-webhook-server-cert" not found Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.003228 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-frr-startup\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.019325 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprgg\" (UniqueName: \"kubernetes.io/projected/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-kube-api-access-lprgg\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.019624 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpv9\" (UniqueName: \"kubernetes.io/projected/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-kube-api-access-jnpv9\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.102640 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kzwd\" (UniqueName: \"kubernetes.io/projected/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-kube-api-access-4kzwd\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.102950 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.102968 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-metrics-certs\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.102990 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-metrics-certs\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.103019 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4db8s\" (UniqueName: \"kubernetes.io/projected/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-kube-api-access-4db8s\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.103035 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-cert\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.103051 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-metallb-excludel2\") pod \"speaker-2nzm9\" (UID: 
\"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.103053 4774 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.103118 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist podName:1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc nodeName:}" failed. No retries permitted until 2025-10-01 13:50:43.603098379 +0000 UTC m=+815.492728976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist") pod "speaker-2nzm9" (UID: "1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc") : secret "metallb-memberlist" not found Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.103684 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-metallb-excludel2\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.109008 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-metrics-certs\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.109047 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-metrics-certs\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc 
kubenswrapper[4774]: I1001 13:50:43.109273 4774 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.118490 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-cert\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.120001 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4db8s\" (UniqueName: \"kubernetes.io/projected/5beb15f9-6d7c-4a0e-b107-4b91e645f9a0-kube-api-access-4db8s\") pod \"controller-5d688f5ffc-rhh9b\" (UID: \"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0\") " pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.125820 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kzwd\" (UniqueName: \"kubernetes.io/projected/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-kube-api-access-4kzwd\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.253500 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.507302 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics-certs\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.507407 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-cert\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.512324 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e-metrics-certs\") pod \"frr-k8s-59tlr\" (UID: \"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e\") " pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.512711 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf28ca5c-6c07-4447-bfde-7c86bd08f4ae-cert\") pod \"frr-k8s-webhook-server-5478bdb765-8g948\" (UID: \"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.608732 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.609042 4774 secret.go:188] 
Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:50:43 crc kubenswrapper[4774]: E1001 13:50:43.609143 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist podName:1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc nodeName:}" failed. No retries permitted until 2025-10-01 13:50:44.609115715 +0000 UTC m=+816.498746342 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist") pod "speaker-2nzm9" (UID: "1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc") : secret "metallb-memberlist" not found Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.704307 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.731331 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:43 crc kubenswrapper[4774]: I1001 13:50:43.754347 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-rhh9b"] Oct 01 13:50:43 crc kubenswrapper[4774]: W1001 13:50:43.781941 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5beb15f9_6d7c_4a0e_b107_4b91e645f9a0.slice/crio-f2cc5b0e948166a2f69b337ae083214c5b6fcbd22848e83c04a31b3b5b03b779 WatchSource:0}: Error finding container f2cc5b0e948166a2f69b337ae083214c5b6fcbd22848e83c04a31b3b5b03b779: Status 404 returned error can't find the container with id f2cc5b0e948166a2f69b337ae083214c5b6fcbd22848e83c04a31b3b5b03b779 Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.212201 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-8g948"] Oct 01 13:50:44 crc 
kubenswrapper[4774]: W1001 13:50:44.231682 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf28ca5c_6c07_4447_bfde_7c86bd08f4ae.slice/crio-b5cf0f3df20568a70ddd78ed9b42fe21074269f265a66643cba8ad6269de1209 WatchSource:0}: Error finding container b5cf0f3df20568a70ddd78ed9b42fe21074269f265a66643cba8ad6269de1209: Status 404 returned error can't find the container with id b5cf0f3df20568a70ddd78ed9b42fe21074269f265a66643cba8ad6269de1209 Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.621129 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.627052 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc-memberlist\") pod \"speaker-2nzm9\" (UID: \"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc\") " pod="metallb-system/speaker-2nzm9" Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.743228 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-2nzm9" Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.766740 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-rhh9b" event={"ID":"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0","Type":"ContainerStarted","Data":"9b57754dcfae38116788e8f3ffd18828511949214eee2b8b322b88db5bf80f66"} Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.766785 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-rhh9b" event={"ID":"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0","Type":"ContainerStarted","Data":"f2cc5b0e948166a2f69b337ae083214c5b6fcbd22848e83c04a31b3b5b03b779"} Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.767714 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"3ac2e3cf29ca50973e4d42db6277589dca2e9d7788a625bed0b3bdd1cd177a5b"} Oct 01 13:50:44 crc kubenswrapper[4774]: I1001 13:50:44.773442 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" event={"ID":"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae","Type":"ContainerStarted","Data":"b5cf0f3df20568a70ddd78ed9b42fe21074269f265a66643cba8ad6269de1209"} Oct 01 13:50:45 crc kubenswrapper[4774]: I1001 13:50:45.799375 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2nzm9" event={"ID":"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc","Type":"ContainerStarted","Data":"0379141efc02e630c96d6a605d229426e4ff3b4e6831e862c951caf9750d7fa9"} Oct 01 13:50:45 crc kubenswrapper[4774]: I1001 13:50:45.799662 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2nzm9" event={"ID":"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc","Type":"ContainerStarted","Data":"a20bbae20c1036130e6ed3ea1b86e8689872e2d0d88e756131e60d5f2e52a0c3"} Oct 01 13:50:47 crc kubenswrapper[4774]: I1001 
13:50:47.809944 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2nzm9" event={"ID":"1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc","Type":"ContainerStarted","Data":"043ff0d240c8cc222a761f8537696876896179729cd89ebc9b0f75c7f81e6b49"} Oct 01 13:50:47 crc kubenswrapper[4774]: I1001 13:50:47.811394 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2nzm9" Oct 01 13:50:47 crc kubenswrapper[4774]: I1001 13:50:47.812507 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-rhh9b" event={"ID":"5beb15f9-6d7c-4a0e-b107-4b91e645f9a0","Type":"ContainerStarted","Data":"4fcd653260015032d6191366edb3b7aa1d563a04130a079ecf08bf94ad2c9d9a"} Oct 01 13:50:47 crc kubenswrapper[4774]: I1001 13:50:47.813493 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:47 crc kubenswrapper[4774]: I1001 13:50:47.833983 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2nzm9" podStartSLOduration=3.307313059 podStartE2EDuration="5.833964497s" podCreationTimestamp="2025-10-01 13:50:42 +0000 UTC" firstStartedPulling="2025-10-01 13:50:44.999482594 +0000 UTC m=+816.889113191" lastFinishedPulling="2025-10-01 13:50:47.526134032 +0000 UTC m=+819.415764629" observedRunningTime="2025-10-01 13:50:47.830492021 +0000 UTC m=+819.720122628" watchObservedRunningTime="2025-10-01 13:50:47.833964497 +0000 UTC m=+819.723595094" Oct 01 13:50:47 crc kubenswrapper[4774]: I1001 13:50:47.848871 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-rhh9b" podStartSLOduration=2.283295264 podStartE2EDuration="5.848849399s" podCreationTimestamp="2025-10-01 13:50:42 +0000 UTC" firstStartedPulling="2025-10-01 13:50:43.952532615 +0000 UTC m=+815.842163212" lastFinishedPulling="2025-10-01 13:50:47.51808675 +0000 UTC m=+819.407717347" 
observedRunningTime="2025-10-01 13:50:47.847267555 +0000 UTC m=+819.736898172" watchObservedRunningTime="2025-10-01 13:50:47.848849399 +0000 UTC m=+819.738479996" Oct 01 13:50:51 crc kubenswrapper[4774]: I1001 13:50:51.840030 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" event={"ID":"bf28ca5c-6c07-4447-bfde-7c86bd08f4ae","Type":"ContainerStarted","Data":"25d2d41873295ba1b9653b5d11d856dd9355b92a2cb8e8f81a396f9701660fba"} Oct 01 13:50:51 crc kubenswrapper[4774]: I1001 13:50:51.840584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:50:51 crc kubenswrapper[4774]: I1001 13:50:51.842551 4774 generic.go:334] "Generic (PLEG): container finished" podID="e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e" containerID="bfd2acc0ff9617a7ba2440a2c24b8535f25e2136083dc1c56db424984f8b4c33" exitCode=0 Oct 01 13:50:51 crc kubenswrapper[4774]: I1001 13:50:51.842610 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerDied","Data":"bfd2acc0ff9617a7ba2440a2c24b8535f25e2136083dc1c56db424984f8b4c33"} Oct 01 13:50:51 crc kubenswrapper[4774]: I1001 13:50:51.864973 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" podStartSLOduration=2.934339531 podStartE2EDuration="9.864946286s" podCreationTimestamp="2025-10-01 13:50:42 +0000 UTC" firstStartedPulling="2025-10-01 13:50:44.23400882 +0000 UTC m=+816.123639417" lastFinishedPulling="2025-10-01 13:50:51.164615585 +0000 UTC m=+823.054246172" observedRunningTime="2025-10-01 13:50:51.863860446 +0000 UTC m=+823.753491083" watchObservedRunningTime="2025-10-01 13:50:51.864946286 +0000 UTC m=+823.754576913" Oct 01 13:50:52 crc kubenswrapper[4774]: I1001 13:50:52.853210 4774 generic.go:334] "Generic (PLEG): container 
finished" podID="e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e" containerID="8584d25dc1992fdb0f2d64b5b9d4542856642da5ad58588d8947f7bab05468d6" exitCode=0 Oct 01 13:50:52 crc kubenswrapper[4774]: I1001 13:50:52.853305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerDied","Data":"8584d25dc1992fdb0f2d64b5b9d4542856642da5ad58588d8947f7bab05468d6"} Oct 01 13:50:53 crc kubenswrapper[4774]: I1001 13:50:53.260535 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-rhh9b" Oct 01 13:50:53 crc kubenswrapper[4774]: I1001 13:50:53.865007 4774 generic.go:334] "Generic (PLEG): container finished" podID="e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e" containerID="4fedba6902de27ac35e9341ccc09b84f0f4d30659e98cd05dff712ec6331dfeb" exitCode=0 Oct 01 13:50:53 crc kubenswrapper[4774]: I1001 13:50:53.865121 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerDied","Data":"4fedba6902de27ac35e9341ccc09b84f0f4d30659e98cd05dff712ec6331dfeb"} Oct 01 13:50:54 crc kubenswrapper[4774]: I1001 13:50:54.751611 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2nzm9" Oct 01 13:50:54 crc kubenswrapper[4774]: I1001 13:50:54.878352 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"417f52fe29eed832543483bee1973fee8108c9c71296fea16d77bb767334af07"} Oct 01 13:50:54 crc kubenswrapper[4774]: I1001 13:50:54.878395 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"2aab094a3a068b50fadd0451e24061bb07ef65b3393b410947dfd7c5eb25cf7e"} Oct 01 13:50:54 crc 
kubenswrapper[4774]: I1001 13:50:54.878408 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"8f3564f5f1ae9ffc0d282e1ce1cb511356830c631255fb5d5797b77d85e74bde"} Oct 01 13:50:54 crc kubenswrapper[4774]: I1001 13:50:54.878423 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"4c5a2a12c245c439bd0efd8c376312c48798af2a1b99c050524e053fdb1aeccd"} Oct 01 13:50:54 crc kubenswrapper[4774]: I1001 13:50:54.878436 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"51b68cad9b5661dc03e4219a25464a2d36d4fd5c104d10b4d4aac6794965af18"} Oct 01 13:50:55 crc kubenswrapper[4774]: I1001 13:50:55.892714 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-59tlr" event={"ID":"e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e","Type":"ContainerStarted","Data":"f934f50cfae8ec49a21b3c525ea8a2b2c7d12be565f5f8c6222a292754e36618"} Oct 01 13:50:55 crc kubenswrapper[4774]: I1001 13:50:55.893158 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:55 crc kubenswrapper[4774]: I1001 13:50:55.921270 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-59tlr" podStartSLOduration=6.670675341 podStartE2EDuration="13.921242096s" podCreationTimestamp="2025-10-01 13:50:42 +0000 UTC" firstStartedPulling="2025-10-01 13:50:43.902872511 +0000 UTC m=+815.792503118" lastFinishedPulling="2025-10-01 13:50:51.153439236 +0000 UTC m=+823.043069873" observedRunningTime="2025-10-01 13:50:55.917131762 +0000 UTC m=+827.806762359" watchObservedRunningTime="2025-10-01 13:50:55.921242096 +0000 UTC m=+827.810872723" Oct 01 13:50:57 crc 
kubenswrapper[4774]: I1001 13:50:57.761990 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-7cfsk"] Oct 01 13:50:57 crc kubenswrapper[4774]: I1001 13:50:57.763062 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:50:57 crc kubenswrapper[4774]: I1001 13:50:57.765323 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-7g5zb" Oct 01 13:50:57 crc kubenswrapper[4774]: I1001 13:50:57.783901 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-7cfsk"] Oct 01 13:50:57 crc kubenswrapper[4774]: I1001 13:50:57.829225 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tn8\" (UniqueName: \"kubernetes.io/projected/6d988a79-de91-4635-b12e-5bd8b0705e36-kube-api-access-t7tn8\") pod \"infra-operator-index-7cfsk\" (UID: \"6d988a79-de91-4635-b12e-5bd8b0705e36\") " pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:50:57 crc kubenswrapper[4774]: I1001 13:50:57.932090 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tn8\" (UniqueName: \"kubernetes.io/projected/6d988a79-de91-4635-b12e-5bd8b0705e36-kube-api-access-t7tn8\") pod \"infra-operator-index-7cfsk\" (UID: \"6d988a79-de91-4635-b12e-5bd8b0705e36\") " pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:50:57 crc kubenswrapper[4774]: I1001 13:50:57.961138 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tn8\" (UniqueName: \"kubernetes.io/projected/6d988a79-de91-4635-b12e-5bd8b0705e36-kube-api-access-t7tn8\") pod \"infra-operator-index-7cfsk\" (UID: \"6d988a79-de91-4635-b12e-5bd8b0705e36\") " pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:50:58 crc kubenswrapper[4774]: I1001 
13:50:58.087085 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:50:58 crc kubenswrapper[4774]: I1001 13:50:58.589091 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-7cfsk"] Oct 01 13:50:58 crc kubenswrapper[4774]: I1001 13:50:58.704778 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:58 crc kubenswrapper[4774]: I1001 13:50:58.768654 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-59tlr" Oct 01 13:50:58 crc kubenswrapper[4774]: I1001 13:50:58.911980 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-7cfsk" event={"ID":"6d988a79-de91-4635-b12e-5bd8b0705e36","Type":"ContainerStarted","Data":"d7e814ec3827ef396810119bf8ae6559fea4e6c45a67987fc749803bd8501976"} Oct 01 13:51:00 crc kubenswrapper[4774]: I1001 13:51:00.927830 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-7cfsk" event={"ID":"6d988a79-de91-4635-b12e-5bd8b0705e36","Type":"ContainerStarted","Data":"38579e5fa038cf7ca45e8bac3e2a5ccea541dbe9556f47651345fe9583c7db99"} Oct 01 13:51:00 crc kubenswrapper[4774]: I1001 13:51:00.953406 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-7cfsk" podStartSLOduration=3.144664106 podStartE2EDuration="3.953373276s" podCreationTimestamp="2025-10-01 13:50:57 +0000 UTC" firstStartedPulling="2025-10-01 13:50:58.597731478 +0000 UTC m=+830.487362115" lastFinishedPulling="2025-10-01 13:50:59.406440668 +0000 UTC m=+831.296071285" observedRunningTime="2025-10-01 13:51:00.948751269 +0000 UTC m=+832.838381946" watchObservedRunningTime="2025-10-01 13:51:00.953373276 +0000 UTC m=+832.843003913" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 
13:51:02.369508 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t9v2f"] Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.371746 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.397909 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9v2f"] Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.398756 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-utilities\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.398896 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-catalog-content\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.399004 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9brlp\" (UniqueName: \"kubernetes.io/projected/b31655f7-4915-4a2c-8fa1-89990b19553b-kube-api-access-9brlp\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.501728 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-catalog-content\") pod \"redhat-operators-t9v2f\" 
(UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.501949 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9brlp\" (UniqueName: \"kubernetes.io/projected/b31655f7-4915-4a2c-8fa1-89990b19553b-kube-api-access-9brlp\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.502018 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-utilities\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.502934 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-catalog-content\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.502976 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-utilities\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.544612 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9brlp\" (UniqueName: \"kubernetes.io/projected/b31655f7-4915-4a2c-8fa1-89990b19553b-kube-api-access-9brlp\") pod \"redhat-operators-t9v2f\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " 
pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:02 crc kubenswrapper[4774]: I1001 13:51:02.733294 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:03 crc kubenswrapper[4774]: I1001 13:51:03.177276 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t9v2f"] Oct 01 13:51:03 crc kubenswrapper[4774]: I1001 13:51:03.736271 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-8g948" Oct 01 13:51:03 crc kubenswrapper[4774]: I1001 13:51:03.949300 4774 generic.go:334] "Generic (PLEG): container finished" podID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerID="1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268" exitCode=0 Oct 01 13:51:03 crc kubenswrapper[4774]: I1001 13:51:03.949340 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9v2f" event={"ID":"b31655f7-4915-4a2c-8fa1-89990b19553b","Type":"ContainerDied","Data":"1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268"} Oct 01 13:51:03 crc kubenswrapper[4774]: I1001 13:51:03.949365 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9v2f" event={"ID":"b31655f7-4915-4a2c-8fa1-89990b19553b","Type":"ContainerStarted","Data":"d34665643955231c2b7f26b1e52bf20648892a21f9e7cab3176cfccb9ba8aa90"} Oct 01 13:51:06 crc kubenswrapper[4774]: I1001 13:51:06.977096 4774 generic.go:334] "Generic (PLEG): container finished" podID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerID="38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d" exitCode=0 Oct 01 13:51:06 crc kubenswrapper[4774]: I1001 13:51:06.977152 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9v2f" 
event={"ID":"b31655f7-4915-4a2c-8fa1-89990b19553b","Type":"ContainerDied","Data":"38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d"} Oct 01 13:51:08 crc kubenswrapper[4774]: I1001 13:51:08.087336 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:51:08 crc kubenswrapper[4774]: I1001 13:51:08.087719 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:51:08 crc kubenswrapper[4774]: I1001 13:51:08.129715 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:51:08 crc kubenswrapper[4774]: I1001 13:51:08.997161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9v2f" event={"ID":"b31655f7-4915-4a2c-8fa1-89990b19553b","Type":"ContainerStarted","Data":"cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4"} Oct 01 13:51:09 crc kubenswrapper[4774]: I1001 13:51:09.034097 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-7cfsk" Oct 01 13:51:09 crc kubenswrapper[4774]: I1001 13:51:09.050391 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t9v2f" podStartSLOduration=3.172944932 podStartE2EDuration="7.050371234s" podCreationTimestamp="2025-10-01 13:51:02 +0000 UTC" firstStartedPulling="2025-10-01 13:51:03.951752284 +0000 UTC m=+835.841382871" lastFinishedPulling="2025-10-01 13:51:07.829178536 +0000 UTC m=+839.718809173" observedRunningTime="2025-10-01 13:51:09.023729867 +0000 UTC m=+840.913360494" watchObservedRunningTime="2025-10-01 13:51:09.050371234 +0000 UTC m=+840.940001821" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.406800 4774 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq"] Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.408668 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.418237 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cbnjv" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.425223 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq"] Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.520063 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-util\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.520158 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z472\" (UniqueName: \"kubernetes.io/projected/a72155a5-47d5-48da-9b7c-e5b36d579a9d-kube-api-access-2z472\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.520196 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-bundle\") pod 
\"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.621542 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-util\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.621616 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z472\" (UniqueName: \"kubernetes.io/projected/a72155a5-47d5-48da-9b7c-e5b36d579a9d-kube-api-access-2z472\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.621649 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-bundle\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.622099 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-util\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " 
pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.622124 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-bundle\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.648632 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z472\" (UniqueName: \"kubernetes.io/projected/a72155a5-47d5-48da-9b7c-e5b36d579a9d-kube-api-access-2z472\") pod \"e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:10 crc kubenswrapper[4774]: I1001 13:51:10.741977 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.171375 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vddh5"] Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.177959 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.188260 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vddh5"] Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.194143 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq"] Oct 01 13:51:11 crc kubenswrapper[4774]: W1001 13:51:11.203392 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda72155a5_47d5_48da_9b7c_e5b36d579a9d.slice/crio-7238ca976948cf3d1ebeb4a73b05decdf988739ec6fd7cfaf2dd3dde6ef8f725 WatchSource:0}: Error finding container 7238ca976948cf3d1ebeb4a73b05decdf988739ec6fd7cfaf2dd3dde6ef8f725: Status 404 returned error can't find the container with id 7238ca976948cf3d1ebeb4a73b05decdf988739ec6fd7cfaf2dd3dde6ef8f725 Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.229273 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-catalog-content\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.229428 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlfcl\" (UniqueName: \"kubernetes.io/projected/47e58259-ec0b-497d-855e-97621054d33a-kube-api-access-vlfcl\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.229520 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-utilities\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.331043 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-catalog-content\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.331134 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlfcl\" (UniqueName: \"kubernetes.io/projected/47e58259-ec0b-497d-855e-97621054d33a-kube-api-access-vlfcl\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.331175 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-utilities\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.331768 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-utilities\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.331847 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-catalog-content\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.361413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlfcl\" (UniqueName: \"kubernetes.io/projected/47e58259-ec0b-497d-855e-97621054d33a-kube-api-access-vlfcl\") pod \"certified-operators-vddh5\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.523364 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:11 crc kubenswrapper[4774]: I1001 13:51:11.953196 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vddh5"] Oct 01 13:51:11 crc kubenswrapper[4774]: W1001 13:51:11.965984 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47e58259_ec0b_497d_855e_97621054d33a.slice/crio-1e089a101cfc1e9bcde4105996d5b924fa721e2ca871de6b58588bc07e739310 WatchSource:0}: Error finding container 1e089a101cfc1e9bcde4105996d5b924fa721e2ca871de6b58588bc07e739310: Status 404 returned error can't find the container with id 1e089a101cfc1e9bcde4105996d5b924fa721e2ca871de6b58588bc07e739310 Oct 01 13:51:12 crc kubenswrapper[4774]: I1001 13:51:12.025942 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerStarted","Data":"1e089a101cfc1e9bcde4105996d5b924fa721e2ca871de6b58588bc07e739310"} Oct 01 13:51:12 crc kubenswrapper[4774]: I1001 13:51:12.028138 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerID="c34ac1e2989938f18e7205472f8c3747ed1b9023eb81bb057b368a3313a8bcc3" exitCode=0 Oct 01 13:51:12 crc kubenswrapper[4774]: I1001 13:51:12.028190 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" event={"ID":"a72155a5-47d5-48da-9b7c-e5b36d579a9d","Type":"ContainerDied","Data":"c34ac1e2989938f18e7205472f8c3747ed1b9023eb81bb057b368a3313a8bcc3"} Oct 01 13:51:12 crc kubenswrapper[4774]: I1001 13:51:12.028221 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" event={"ID":"a72155a5-47d5-48da-9b7c-e5b36d579a9d","Type":"ContainerStarted","Data":"7238ca976948cf3d1ebeb4a73b05decdf988739ec6fd7cfaf2dd3dde6ef8f725"} Oct 01 13:51:12 crc kubenswrapper[4774]: I1001 13:51:12.734548 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:12 crc kubenswrapper[4774]: I1001 13:51:12.735181 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:13 crc kubenswrapper[4774]: I1001 13:51:13.037105 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerStarted","Data":"755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f"} Oct 01 13:51:13 crc kubenswrapper[4774]: I1001 13:51:13.709037 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-59tlr" Oct 01 13:51:13 crc kubenswrapper[4774]: I1001 13:51:13.787186 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t9v2f" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="registry-server" probeResult="failure" output=< 
Oct 01 13:51:13 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s Oct 01 13:51:13 crc kubenswrapper[4774]: > Oct 01 13:51:14 crc kubenswrapper[4774]: I1001 13:51:14.047680 4774 generic.go:334] "Generic (PLEG): container finished" podID="47e58259-ec0b-497d-855e-97621054d33a" containerID="755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f" exitCode=0 Oct 01 13:51:14 crc kubenswrapper[4774]: I1001 13:51:14.047799 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerDied","Data":"755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f"} Oct 01 13:51:14 crc kubenswrapper[4774]: I1001 13:51:14.055383 4774 generic.go:334] "Generic (PLEG): container finished" podID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerID="03f35c0fc5b9029d2aa284d635265aefde22456847a76e835a0f5cef76f94c79" exitCode=0 Oct 01 13:51:14 crc kubenswrapper[4774]: I1001 13:51:14.055422 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" event={"ID":"a72155a5-47d5-48da-9b7c-e5b36d579a9d","Type":"ContainerDied","Data":"03f35c0fc5b9029d2aa284d635265aefde22456847a76e835a0f5cef76f94c79"} Oct 01 13:51:15 crc kubenswrapper[4774]: I1001 13:51:15.071689 4774 generic.go:334] "Generic (PLEG): container finished" podID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerID="efaba0405c58282c077db7c225b5847881c04e4bac76abb171a8e7db40105461" exitCode=0 Oct 01 13:51:15 crc kubenswrapper[4774]: I1001 13:51:15.071822 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" event={"ID":"a72155a5-47d5-48da-9b7c-e5b36d579a9d","Type":"ContainerDied","Data":"efaba0405c58282c077db7c225b5847881c04e4bac76abb171a8e7db40105461"} Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 
13:51:16.080144 4774 generic.go:334] "Generic (PLEG): container finished" podID="47e58259-ec0b-497d-855e-97621054d33a" containerID="dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d" exitCode=0 Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.080238 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerDied","Data":"dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d"} Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.380435 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.502130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z472\" (UniqueName: \"kubernetes.io/projected/a72155a5-47d5-48da-9b7c-e5b36d579a9d-kube-api-access-2z472\") pod \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.502177 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-util\") pod \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.502231 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-bundle\") pod \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\" (UID: \"a72155a5-47d5-48da-9b7c-e5b36d579a9d\") " Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.503487 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-bundle" (OuterVolumeSpecName: "bundle") pod "a72155a5-47d5-48da-9b7c-e5b36d579a9d" (UID: "a72155a5-47d5-48da-9b7c-e5b36d579a9d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.511734 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72155a5-47d5-48da-9b7c-e5b36d579a9d-kube-api-access-2z472" (OuterVolumeSpecName: "kube-api-access-2z472") pod "a72155a5-47d5-48da-9b7c-e5b36d579a9d" (UID: "a72155a5-47d5-48da-9b7c-e5b36d579a9d"). InnerVolumeSpecName "kube-api-access-2z472". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.514770 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-util" (OuterVolumeSpecName: "util") pod "a72155a5-47d5-48da-9b7c-e5b36d579a9d" (UID: "a72155a5-47d5-48da-9b7c-e5b36d579a9d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.603884 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z472\" (UniqueName: \"kubernetes.io/projected/a72155a5-47d5-48da-9b7c-e5b36d579a9d-kube-api-access-2z472\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.603921 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:16 crc kubenswrapper[4774]: I1001 13:51:16.603953 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72155a5-47d5-48da-9b7c-e5b36d579a9d-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:17 crc kubenswrapper[4774]: I1001 13:51:17.092234 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerStarted","Data":"4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8"} Oct 01 13:51:17 crc kubenswrapper[4774]: I1001 13:51:17.096896 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" event={"ID":"a72155a5-47d5-48da-9b7c-e5b36d579a9d","Type":"ContainerDied","Data":"7238ca976948cf3d1ebeb4a73b05decdf988739ec6fd7cfaf2dd3dde6ef8f725"} Oct 01 13:51:17 crc kubenswrapper[4774]: I1001 13:51:17.096951 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7238ca976948cf3d1ebeb4a73b05decdf988739ec6fd7cfaf2dd3dde6ef8f725" Oct 01 13:51:17 crc kubenswrapper[4774]: I1001 13:51:17.097078 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq" Oct 01 13:51:17 crc kubenswrapper[4774]: I1001 13:51:17.122689 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vddh5" podStartSLOduration=3.535590091 podStartE2EDuration="6.122662838s" podCreationTimestamp="2025-10-01 13:51:11 +0000 UTC" firstStartedPulling="2025-10-01 13:51:14.049832896 +0000 UTC m=+845.939463503" lastFinishedPulling="2025-10-01 13:51:16.636905623 +0000 UTC m=+848.526536250" observedRunningTime="2025-10-01 13:51:17.119480379 +0000 UTC m=+849.009111006" watchObservedRunningTime="2025-10-01 13:51:17.122662838 +0000 UTC m=+849.012293465" Oct 01 13:51:21 crc kubenswrapper[4774]: I1001 13:51:21.524518 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:21 crc kubenswrapper[4774]: I1001 13:51:21.525094 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:21 crc kubenswrapper[4774]: I1001 13:51:21.575524 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.193816 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.748699 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt"] Oct 01 13:51:22 crc kubenswrapper[4774]: E1001 13:51:22.749002 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="extract" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.749019 4774 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="extract" Oct 01 13:51:22 crc kubenswrapper[4774]: E1001 13:51:22.749045 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="pull" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.749053 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="pull" Oct 01 13:51:22 crc kubenswrapper[4774]: E1001 13:51:22.749071 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="util" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.749078 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="util" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.749214 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72155a5-47d5-48da-9b7c-e5b36d579a9d" containerName="extract" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.750003 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.753980 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.758727 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9n27n" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.795846 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt"] Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.797964 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2f00bec-2d63-49db-94dc-82a40bd0857c-webhook-cert\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.798026 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2q2x\" (UniqueName: \"kubernetes.io/projected/c2f00bec-2d63-49db-94dc-82a40bd0857c-kube-api-access-n2q2x\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.798104 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2f00bec-2d63-49db-94dc-82a40bd0857c-apiservice-cert\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: 
\"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.815412 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.877743 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.898866 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2f00bec-2d63-49db-94dc-82a40bd0857c-webhook-cert\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.898977 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2q2x\" (UniqueName: \"kubernetes.io/projected/c2f00bec-2d63-49db-94dc-82a40bd0857c-kube-api-access-n2q2x\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.899076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2f00bec-2d63-49db-94dc-82a40bd0857c-apiservice-cert\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.904102 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/c2f00bec-2d63-49db-94dc-82a40bd0857c-apiservice-cert\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.904239 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2f00bec-2d63-49db-94dc-82a40bd0857c-webhook-cert\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:22 crc kubenswrapper[4774]: I1001 13:51:22.919045 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2q2x\" (UniqueName: \"kubernetes.io/projected/c2f00bec-2d63-49db-94dc-82a40bd0857c-kube-api-access-n2q2x\") pod \"infra-operator-controller-manager-6dc4785855-lh6lt\" (UID: \"c2f00bec-2d63-49db-94dc-82a40bd0857c\") " pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:23 crc kubenswrapper[4774]: I1001 13:51:23.138192 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:23 crc kubenswrapper[4774]: I1001 13:51:23.588769 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt"] Oct 01 13:51:23 crc kubenswrapper[4774]: W1001 13:51:23.606155 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f00bec_2d63_49db_94dc_82a40bd0857c.slice/crio-fa4bb1e8fea4899c9c78507245f2196f01390ecd2318a2f0a325ea370e3a3a5f WatchSource:0}: Error finding container fa4bb1e8fea4899c9c78507245f2196f01390ecd2318a2f0a325ea370e3a3a5f: Status 404 returned error can't find the container with id fa4bb1e8fea4899c9c78507245f2196f01390ecd2318a2f0a325ea370e3a3a5f Oct 01 13:51:24 crc kubenswrapper[4774]: I1001 13:51:24.178899 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" event={"ID":"c2f00bec-2d63-49db-94dc-82a40bd0857c","Type":"ContainerStarted","Data":"fa4bb1e8fea4899c9c78507245f2196f01390ecd2318a2f0a325ea370e3a3a5f"} Oct 01 13:51:24 crc kubenswrapper[4774]: I1001 13:51:24.548730 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vddh5"] Oct 01 13:51:24 crc kubenswrapper[4774]: I1001 13:51:24.548997 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vddh5" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="registry-server" containerID="cri-o://4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8" gracePeriod=2 Oct 01 13:51:24 crc kubenswrapper[4774]: I1001 13:51:24.958617 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.032788 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-utilities\") pod \"47e58259-ec0b-497d-855e-97621054d33a\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.032846 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlfcl\" (UniqueName: \"kubernetes.io/projected/47e58259-ec0b-497d-855e-97621054d33a-kube-api-access-vlfcl\") pod \"47e58259-ec0b-497d-855e-97621054d33a\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.032986 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-catalog-content\") pod \"47e58259-ec0b-497d-855e-97621054d33a\" (UID: \"47e58259-ec0b-497d-855e-97621054d33a\") " Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.033853 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-utilities" (OuterVolumeSpecName: "utilities") pod "47e58259-ec0b-497d-855e-97621054d33a" (UID: "47e58259-ec0b-497d-855e-97621054d33a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.041319 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e58259-ec0b-497d-855e-97621054d33a-kube-api-access-vlfcl" (OuterVolumeSpecName: "kube-api-access-vlfcl") pod "47e58259-ec0b-497d-855e-97621054d33a" (UID: "47e58259-ec0b-497d-855e-97621054d33a"). InnerVolumeSpecName "kube-api-access-vlfcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.086271 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47e58259-ec0b-497d-855e-97621054d33a" (UID: "47e58259-ec0b-497d-855e-97621054d33a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.134733 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.134772 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e58259-ec0b-497d-855e-97621054d33a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.134783 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlfcl\" (UniqueName: \"kubernetes.io/projected/47e58259-ec0b-497d-855e-97621054d33a-kube-api-access-vlfcl\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.190109 4774 generic.go:334] "Generic (PLEG): container finished" podID="47e58259-ec0b-497d-855e-97621054d33a" containerID="4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8" exitCode=0 Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.190167 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vddh5" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.190157 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerDied","Data":"4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8"} Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.190224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vddh5" event={"ID":"47e58259-ec0b-497d-855e-97621054d33a","Type":"ContainerDied","Data":"1e089a101cfc1e9bcde4105996d5b924fa721e2ca871de6b58588bc07e739310"} Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.190272 4774 scope.go:117] "RemoveContainer" containerID="4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.218831 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vddh5"] Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.221923 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vddh5"] Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.513154 4774 scope.go:117] "RemoveContainer" containerID="dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.532645 4774 scope.go:117] "RemoveContainer" containerID="755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.566611 4774 scope.go:117] "RemoveContainer" containerID="4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8" Oct 01 13:51:25 crc kubenswrapper[4774]: E1001 13:51:25.569051 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8\": container with ID starting with 4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8 not found: ID does not exist" containerID="4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.569092 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8"} err="failed to get container status \"4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8\": rpc error: code = NotFound desc = could not find container \"4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8\": container with ID starting with 4add765c203863a8cf5f32745b8babbab08927bb683e093227bc63a4f98cccc8 not found: ID does not exist" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.569145 4774 scope.go:117] "RemoveContainer" containerID="dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d" Oct 01 13:51:25 crc kubenswrapper[4774]: E1001 13:51:25.569634 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d\": container with ID starting with dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d not found: ID does not exist" containerID="dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.569697 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d"} err="failed to get container status \"dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d\": rpc error: code = NotFound desc = could not find container \"dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d\": container with ID 
starting with dbcd9e182bb76dfe6bb260e0a21075d60c343a56286ab286162f0e8168db430d not found: ID does not exist" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.569756 4774 scope.go:117] "RemoveContainer" containerID="755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f" Oct 01 13:51:25 crc kubenswrapper[4774]: E1001 13:51:25.570245 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f\": container with ID starting with 755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f not found: ID does not exist" containerID="755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f" Oct 01 13:51:25 crc kubenswrapper[4774]: I1001 13:51:25.570276 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f"} err="failed to get container status \"755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f\": rpc error: code = NotFound desc = could not find container \"755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f\": container with ID starting with 755ae32b470fe92aae0518cfc86c78f053eaff47532d6b43189b4f9394e0e50f not found: ID does not exist" Oct 01 13:51:26 crc kubenswrapper[4774]: I1001 13:51:26.200401 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" event={"ID":"c2f00bec-2d63-49db-94dc-82a40bd0857c","Type":"ContainerStarted","Data":"834b7d20649369bafbda65b1b525424b08086c280c0d21fccbbbcfccfcc85b98"} Oct 01 13:51:26 crc kubenswrapper[4774]: I1001 13:51:26.200690 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" 
event={"ID":"c2f00bec-2d63-49db-94dc-82a40bd0857c","Type":"ContainerStarted","Data":"cb3ad4afc0d02742915af51ab3b857a09d553a5358d721e6581f350999622ad1"} Oct 01 13:51:26 crc kubenswrapper[4774]: I1001 13:51:26.200747 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:26 crc kubenswrapper[4774]: I1001 13:51:26.222964 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" podStartSLOduration=2.179685093 podStartE2EDuration="4.222946923s" podCreationTimestamp="2025-10-01 13:51:22 +0000 UTC" firstStartedPulling="2025-10-01 13:51:23.611894112 +0000 UTC m=+855.501524709" lastFinishedPulling="2025-10-01 13:51:25.655155902 +0000 UTC m=+857.544786539" observedRunningTime="2025-10-01 13:51:26.221551731 +0000 UTC m=+858.111182358" watchObservedRunningTime="2025-10-01 13:51:26.222946923 +0000 UTC m=+858.112577530" Oct 01 13:51:26 crc kubenswrapper[4774]: I1001 13:51:26.876583 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e58259-ec0b-497d-855e-97621054d33a" path="/var/lib/kubelet/pods/47e58259-ec0b-497d-855e-97621054d33a/volumes" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.575627 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9v2f"] Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.575985 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t9v2f" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="registry-server" containerID="cri-o://cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4" gracePeriod=2 Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.930488 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 01 13:51:27 crc 
kubenswrapper[4774]: E1001 13:51:27.930972 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="registry-server" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.930983 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="registry-server" Oct 01 13:51:27 crc kubenswrapper[4774]: E1001 13:51:27.931003 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="extract-utilities" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.931009 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="extract-utilities" Oct 01 13:51:27 crc kubenswrapper[4774]: E1001 13:51:27.931016 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="extract-content" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.931023 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="extract-content" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.931119 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e58259-ec0b-497d-855e-97621054d33a" containerName="registry-server" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.931699 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.933850 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openshift-service-ca.crt" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.933903 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"osp-secret" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.934117 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"kube-root-ca.crt" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.934177 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-scripts" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.934217 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config-data" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.934377 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"galera-openstack-dockercfg-bp9sw" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.947393 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.950604 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.957542 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.958758 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.963211 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.969513 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.975719 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.977294 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr89v\" (UniqueName: \"kubernetes.io/projected/1a4a45b8-6786-400a-ad17-d6318d1d3da6-kube-api-access-qr89v\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.978790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.978896 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1a4a45b8-6786-400a-ad17-d6318d1d3da6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.980082 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " 
pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.980636 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-config-data-default\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.981251 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-kolla-config\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.981429 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/1a4a45b8-6786-400a-ad17-d6318d1d3da6-secrets\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:27 crc kubenswrapper[4774]: I1001 13:51:27.987708 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.082759 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-catalog-content\") pod \"b31655f7-4915-4a2c-8fa1-89990b19553b\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.082855 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-utilities\") pod \"b31655f7-4915-4a2c-8fa1-89990b19553b\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.082924 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9brlp\" (UniqueName: \"kubernetes.io/projected/b31655f7-4915-4a2c-8fa1-89990b19553b-kube-api-access-9brlp\") pod \"b31655f7-4915-4a2c-8fa1-89990b19553b\" (UID: \"b31655f7-4915-4a2c-8fa1-89990b19553b\") " Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083150 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083213 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv62l\" (UniqueName: \"kubernetes.io/projected/5660c969-322b-4ef6-a625-091735875ab7-kube-api-access-lv62l\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083251 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-kolla-config\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083283 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083309 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5660c969-322b-4ef6-a625-091735875ab7-secrets\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083356 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6b74e8cc-1edb-4f88-89be-672909669498-secrets\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083392 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr89v\" (UniqueName: \"kubernetes.io/projected/1a4a45b8-6786-400a-ad17-d6318d1d3da6-kube-api-access-qr89v\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083423 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8kf\" (UniqueName: \"kubernetes.io/projected/6b74e8cc-1edb-4f88-89be-672909669498-kube-api-access-7n8kf\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083489 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1a4a45b8-6786-400a-ad17-d6318d1d3da6-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083532 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083569 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083615 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b74e8cc-1edb-4f88-89be-672909669498-config-data-generated\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083644 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-config-data-default\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083673 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-config-data-default\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " 
pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083719 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/1a4a45b8-6786-400a-ad17-d6318d1d3da6-secrets\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083759 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-kolla-config\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083795 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-kolla-config\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083825 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-config-data-default\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083855 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5660c969-322b-4ef6-a625-091735875ab7-config-data-generated\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " 
pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083886 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-operator-scripts\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.083915 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-operator-scripts\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.084141 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-utilities" (OuterVolumeSpecName: "utilities") pod "b31655f7-4915-4a2c-8fa1-89990b19553b" (UID: "b31655f7-4915-4a2c-8fa1-89990b19553b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.084474 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") device mount path \"/mnt/openstack/pv09\"" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.089471 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-kolla-config\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.090464 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1a4a45b8-6786-400a-ad17-d6318d1d3da6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.090989 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.091567 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31655f7-4915-4a2c-8fa1-89990b19553b-kube-api-access-9brlp" (OuterVolumeSpecName: "kube-api-access-9brlp") pod "b31655f7-4915-4a2c-8fa1-89990b19553b" (UID: "b31655f7-4915-4a2c-8fa1-89990b19553b"). InnerVolumeSpecName "kube-api-access-9brlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.091667 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1a4a45b8-6786-400a-ad17-d6318d1d3da6-config-data-default\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.096535 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/1a4a45b8-6786-400a-ad17-d6318d1d3da6-secrets\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.115168 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr89v\" (UniqueName: \"kubernetes.io/projected/1a4a45b8-6786-400a-ad17-d6318d1d3da6-kube-api-access-qr89v\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.115785 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"1a4a45b8-6786-400a-ad17-d6318d1d3da6\") " pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.174522 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b31655f7-4915-4a2c-8fa1-89990b19553b" (UID: "b31655f7-4915-4a2c-8fa1-89990b19553b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185747 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185803 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b74e8cc-1edb-4f88-89be-672909669498-config-data-generated\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185821 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-config-data-default\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185851 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-kolla-config\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185868 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-config-data-default\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc 
kubenswrapper[4774]: I1001 13:51:28.185883 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5660c969-322b-4ef6-a625-091735875ab7-config-data-generated\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185902 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-operator-scripts\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185918 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-operator-scripts\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185951 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv62l\" (UniqueName: \"kubernetes.io/projected/5660c969-322b-4ef6-a625-091735875ab7-kube-api-access-lv62l\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185973 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185976 4774 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") device mount path \"/mnt/openstack/pv12\"" pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.186421 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6b74e8cc-1edb-4f88-89be-672909669498-config-data-generated\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.186550 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5660c969-322b-4ef6-a625-091735875ab7-config-data-generated\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.187315 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-config-data-default\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.187339 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-kolla-config\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.187779 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") device mount path \"/mnt/openstack/pv05\"" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.187972 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-config-data-default\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.185987 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5660c969-322b-4ef6-a625-091735875ab7-secrets\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188019 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-operator-scripts\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188035 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-kolla-config\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188119 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6b74e8cc-1edb-4f88-89be-672909669498-secrets\") pod \"openstack-galera-2\" (UID: 
\"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188156 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8kf\" (UniqueName: \"kubernetes.io/projected/6b74e8cc-1edb-4f88-89be-672909669498-kube-api-access-7n8kf\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188237 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188252 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b31655f7-4915-4a2c-8fa1-89990b19553b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188266 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9brlp\" (UniqueName: \"kubernetes.io/projected/b31655f7-4915-4a2c-8fa1-89990b19553b-kube-api-access-9brlp\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.188516 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5660c969-322b-4ef6-a625-091735875ab7-kolla-config\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.189126 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b74e8cc-1edb-4f88-89be-672909669498-operator-scripts\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " 
pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.190204 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5660c969-322b-4ef6-a625-091735875ab7-secrets\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.197039 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/6b74e8cc-1edb-4f88-89be-672909669498-secrets\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.208575 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.211845 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.212794 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8kf\" (UniqueName: \"kubernetes.io/projected/6b74e8cc-1edb-4f88-89be-672909669498-kube-api-access-7n8kf\") pod \"openstack-galera-2\" (UID: \"6b74e8cc-1edb-4f88-89be-672909669498\") " pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.217226 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerID="cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4" exitCode=0 Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.217299 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9v2f" event={"ID":"b31655f7-4915-4a2c-8fa1-89990b19553b","Type":"ContainerDied","Data":"cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4"} Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.217786 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t9v2f" event={"ID":"b31655f7-4915-4a2c-8fa1-89990b19553b","Type":"ContainerDied","Data":"d34665643955231c2b7f26b1e52bf20648892a21f9e7cab3176cfccb9ba8aa90"} Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.217816 4774 scope.go:117] "RemoveContainer" containerID="cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.217348 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t9v2f" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.230630 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv62l\" (UniqueName: \"kubernetes.io/projected/5660c969-322b-4ef6-a625-091735875ab7-kube-api-access-lv62l\") pod \"openstack-galera-1\" (UID: \"5660c969-322b-4ef6-a625-091735875ab7\") " pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.243980 4774 scope.go:117] "RemoveContainer" containerID="38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.246949 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t9v2f"] Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.252114 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t9v2f"] Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.273778 4774 scope.go:117] "RemoveContainer" containerID="1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.278269 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.298986 4774 scope.go:117] "RemoveContainer" containerID="cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.299239 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:28 crc kubenswrapper[4774]: E1001 13:51:28.306990 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4\": container with ID starting with cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4 not found: ID does not exist" containerID="cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.307105 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4"} err="failed to get container status \"cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4\": rpc error: code = NotFound desc = could not find container \"cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4\": container with ID starting with cdac6c5da1271132ed4edb86f8fbb8b10b7989c8d80879cd8b95ca50959d4ec4 not found: ID does not exist" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.307142 4774 scope.go:117] "RemoveContainer" containerID="38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.307559 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:28 crc kubenswrapper[4774]: E1001 13:51:28.307709 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d\": container with ID starting with 38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d not found: ID does not exist" containerID="38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.307770 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d"} err="failed to get container status \"38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d\": rpc error: code = NotFound desc = could not find container \"38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d\": container with ID starting with 38f2b16bdebdabd4581fcc9c07e80cf17709a62763c83464a0aa8e6e26c1f57d not found: ID does not exist" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.307799 4774 scope.go:117] "RemoveContainer" containerID="1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268" Oct 01 13:51:28 crc kubenswrapper[4774]: E1001 13:51:28.309065 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268\": container with ID starting with 1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268 not found: ID does not exist" containerID="1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.309128 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268"} err="failed to get container status \"1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268\": rpc error: code = NotFound desc = could not find container \"1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268\": container with ID starting with 1ff63a4995cd0d34134c1941053710b6ca4e95f94b928e07ebc339e108e7d268 not found: ID does not exist" Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.779030 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-0"] Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.782216 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-2"] Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.798629 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstack-galera-1"] Oct 01 13:51:28 crc kubenswrapper[4774]: I1001 13:51:28.877137 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" path="/var/lib/kubelet/pods/b31655f7-4915-4a2c-8fa1-89990b19553b/volumes" Oct 01 13:51:29 crc kubenswrapper[4774]: I1001 13:51:29.231849 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"6b74e8cc-1edb-4f88-89be-672909669498","Type":"ContainerStarted","Data":"17f0dbaa20eb3ef66bae29e4c284a6897fedaa6de5e17d28e6567200a8072075"} Oct 01 13:51:29 crc kubenswrapper[4774]: I1001 13:51:29.233224 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1a4a45b8-6786-400a-ad17-d6318d1d3da6","Type":"ContainerStarted","Data":"8d3f563310197a37173785cc22d830d2bb070643d48e9bded67de1c06c219552"} Oct 01 13:51:29 crc kubenswrapper[4774]: I1001 13:51:29.235293 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"5660c969-322b-4ef6-a625-091735875ab7","Type":"ContainerStarted","Data":"67a6cfe6dfe5e10fb9c6b5be131eb931ebe958dd00afecdfc07b5a5e90ea2772"} Oct 01 13:51:33 crc kubenswrapper[4774]: I1001 13:51:33.150792 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6dc4785855-lh6lt" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.051948 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 01 13:51:35 crc kubenswrapper[4774]: E1001 13:51:35.052525 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="registry-server" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.052544 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="registry-server" Oct 01 13:51:35 crc kubenswrapper[4774]: E1001 13:51:35.052582 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="extract-utilities" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.052591 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="extract-utilities" Oct 01 13:51:35 crc kubenswrapper[4774]: E1001 13:51:35.052603 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="extract-content" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.052610 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="extract-content" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.052746 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31655f7-4915-4a2c-8fa1-89990b19553b" containerName="registry-server" Oct 01 13:51:35 crc 
kubenswrapper[4774]: I1001 13:51:35.053351 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.057540 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"memcached-config-data" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.061631 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"memcached-memcached-dockercfg-zq99v" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.063484 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.096167 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beae5224-51a1-4e93-9381-a10808afc6c1-config-data\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.096217 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/beae5224-51a1-4e93-9381-a10808afc6c1-kolla-config\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.096254 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzh5q\" (UniqueName: \"kubernetes.io/projected/beae5224-51a1-4e93-9381-a10808afc6c1-kube-api-access-rzh5q\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.197764 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/beae5224-51a1-4e93-9381-a10808afc6c1-config-data\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.197808 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/beae5224-51a1-4e93-9381-a10808afc6c1-kolla-config\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.197846 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzh5q\" (UniqueName: \"kubernetes.io/projected/beae5224-51a1-4e93-9381-a10808afc6c1-kube-api-access-rzh5q\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.199718 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/beae5224-51a1-4e93-9381-a10808afc6c1-config-data\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.200243 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/beae5224-51a1-4e93-9381-a10808afc6c1-kolla-config\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.231149 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzh5q\" (UniqueName: \"kubernetes.io/projected/beae5224-51a1-4e93-9381-a10808afc6c1-kube-api-access-rzh5q\") pod \"memcached-0\" (UID: \"beae5224-51a1-4e93-9381-a10808afc6c1\") " 
pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:35 crc kubenswrapper[4774]: I1001 13:51:35.379050 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:38 crc kubenswrapper[4774]: I1001 13:51:38.499439 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/memcached-0"] Oct 01 13:51:38 crc kubenswrapper[4774]: W1001 13:51:38.504321 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbeae5224_51a1_4e93_9381_a10808afc6c1.slice/crio-af774cb6d632726d3b1f52d73a508e524386c8f0dc31d00e95fb925495997a78 WatchSource:0}: Error finding container af774cb6d632726d3b1f52d73a508e524386c8f0dc31d00e95fb925495997a78: Status 404 returned error can't find the container with id af774cb6d632726d3b1f52d73a508e524386c8f0dc31d00e95fb925495997a78 Oct 01 13:51:38 crc kubenswrapper[4774]: I1001 13:51:38.959214 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-l72d5"] Oct 01 13:51:38 crc kubenswrapper[4774]: I1001 13:51:38.960024 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:38 crc kubenswrapper[4774]: I1001 13:51:38.964901 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-w5nqt" Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.029555 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-l72d5"] Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.060427 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgjnz\" (UniqueName: \"kubernetes.io/projected/1fc5545f-e6d0-4cd1-9abf-44138f6dc054-kube-api-access-vgjnz\") pod \"rabbitmq-cluster-operator-index-l72d5\" (UID: \"1fc5545f-e6d0-4cd1-9abf-44138f6dc054\") " pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.161855 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgjnz\" (UniqueName: \"kubernetes.io/projected/1fc5545f-e6d0-4cd1-9abf-44138f6dc054-kube-api-access-vgjnz\") pod \"rabbitmq-cluster-operator-index-l72d5\" (UID: \"1fc5545f-e6d0-4cd1-9abf-44138f6dc054\") " pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.196367 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgjnz\" (UniqueName: \"kubernetes.io/projected/1fc5545f-e6d0-4cd1-9abf-44138f6dc054-kube-api-access-vgjnz\") pod \"rabbitmq-cluster-operator-index-l72d5\" (UID: \"1fc5545f-e6d0-4cd1-9abf-44138f6dc054\") " pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.277709 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.346484 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"5660c969-322b-4ef6-a625-091735875ab7","Type":"ContainerStarted","Data":"c2dc38c2c0c5d89391509e5a70013e8bed0c99417742965409c7d0ffbaab6d37"} Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.348194 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"beae5224-51a1-4e93-9381-a10808afc6c1","Type":"ContainerStarted","Data":"af774cb6d632726d3b1f52d73a508e524386c8f0dc31d00e95fb925495997a78"} Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.349961 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"6b74e8cc-1edb-4f88-89be-672909669498","Type":"ContainerStarted","Data":"a2a184fbf2e757b5055ade2714fbfdef87cea0f2cb349e47cf4a167a69711156"} Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.355442 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1a4a45b8-6786-400a-ad17-d6318d1d3da6","Type":"ContainerStarted","Data":"24ba0a52e2e8226fd1af99d76f9bc03954f16458c706feb42a0d88129e348596"} Oct 01 13:51:39 crc kubenswrapper[4774]: I1001 13:51:39.572593 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-l72d5"] Oct 01 13:51:39 crc kubenswrapper[4774]: W1001 13:51:39.586798 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc5545f_e6d0_4cd1_9abf_44138f6dc054.slice/crio-0982b562ec89178800bc1160ff87628392378c6469467484c36793c7983ab7da WatchSource:0}: Error finding container 0982b562ec89178800bc1160ff87628392378c6469467484c36793c7983ab7da: Status 404 returned error can't find the container with 
id 0982b562ec89178800bc1160ff87628392378c6469467484c36793c7983ab7da Oct 01 13:51:40 crc kubenswrapper[4774]: I1001 13:51:40.369790 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" event={"ID":"1fc5545f-e6d0-4cd1-9abf-44138f6dc054","Type":"ContainerStarted","Data":"0982b562ec89178800bc1160ff87628392378c6469467484c36793c7983ab7da"} Oct 01 13:51:41 crc kubenswrapper[4774]: I1001 13:51:41.377870 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/memcached-0" event={"ID":"beae5224-51a1-4e93-9381-a10808afc6c1","Type":"ContainerStarted","Data":"d335012fb051329cadaff7acabf117b78f14b14293159ef3167ac769da4bca8c"} Oct 01 13:51:41 crc kubenswrapper[4774]: I1001 13:51:41.378269 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:41 crc kubenswrapper[4774]: I1001 13:51:41.395724 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/memcached-0" podStartSLOduration=3.695646187 podStartE2EDuration="6.395708354s" podCreationTimestamp="2025-10-01 13:51:35 +0000 UTC" firstStartedPulling="2025-10-01 13:51:38.50637939 +0000 UTC m=+870.396009987" lastFinishedPulling="2025-10-01 13:51:41.206441547 +0000 UTC m=+873.096072154" observedRunningTime="2025-10-01 13:51:41.394029548 +0000 UTC m=+873.283660155" watchObservedRunningTime="2025-10-01 13:51:41.395708354 +0000 UTC m=+873.285338951" Oct 01 13:51:42 crc kubenswrapper[4774]: I1001 13:51:42.387095 4774 generic.go:334] "Generic (PLEG): container finished" podID="1a4a45b8-6786-400a-ad17-d6318d1d3da6" containerID="24ba0a52e2e8226fd1af99d76f9bc03954f16458c706feb42a0d88129e348596" exitCode=0 Oct 01 13:51:42 crc kubenswrapper[4774]: I1001 13:51:42.387485 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" 
event={"ID":"1a4a45b8-6786-400a-ad17-d6318d1d3da6","Type":"ContainerDied","Data":"24ba0a52e2e8226fd1af99d76f9bc03954f16458c706feb42a0d88129e348596"} Oct 01 13:51:42 crc kubenswrapper[4774]: I1001 13:51:42.390703 4774 generic.go:334] "Generic (PLEG): container finished" podID="5660c969-322b-4ef6-a625-091735875ab7" containerID="c2dc38c2c0c5d89391509e5a70013e8bed0c99417742965409c7d0ffbaab6d37" exitCode=0 Oct 01 13:51:42 crc kubenswrapper[4774]: I1001 13:51:42.390756 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"5660c969-322b-4ef6-a625-091735875ab7","Type":"ContainerDied","Data":"c2dc38c2c0c5d89391509e5a70013e8bed0c99417742965409c7d0ffbaab6d37"} Oct 01 13:51:42 crc kubenswrapper[4774]: I1001 13:51:42.392673 4774 generic.go:334] "Generic (PLEG): container finished" podID="6b74e8cc-1edb-4f88-89be-672909669498" containerID="a2a184fbf2e757b5055ade2714fbfdef87cea0f2cb349e47cf4a167a69711156" exitCode=0 Oct 01 13:51:42 crc kubenswrapper[4774]: I1001 13:51:42.392728 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"6b74e8cc-1edb-4f88-89be-672909669498","Type":"ContainerDied","Data":"a2a184fbf2e757b5055ade2714fbfdef87cea0f2cb349e47cf4a167a69711156"} Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.412022 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-1" event={"ID":"5660c969-322b-4ef6-a625-091735875ab7","Type":"ContainerStarted","Data":"2e4a6284a50ec1f80d5a6871aaf6b7c9762b8e9750a4210af8ee839d6cb31a31"} Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.413967 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" event={"ID":"1fc5545f-e6d0-4cd1-9abf-44138f6dc054","Type":"ContainerStarted","Data":"872457387cf5f4562767a6f6829e362e5c6663dca83a129c2de5d482a4c7e67a"} Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.417080 
4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-2" event={"ID":"6b74e8cc-1edb-4f88-89be-672909669498","Type":"ContainerStarted","Data":"8b999288b47198845bd7b6a82d64b1df537fff09fe1952c30cd843a1afcd3b91"} Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.419791 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/openstack-galera-0" event={"ID":"1a4a45b8-6786-400a-ad17-d6318d1d3da6","Type":"ContainerStarted","Data":"8ccaea5acea4131d8f04091ae9d3412b9cac92abb2c766612d7ddd204f8de2fc"} Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.442059 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-1" podStartSLOduration=9.099456873 podStartE2EDuration="18.442034261s" podCreationTimestamp="2025-10-01 13:51:26 +0000 UTC" firstStartedPulling="2025-10-01 13:51:28.816503323 +0000 UTC m=+860.706133920" lastFinishedPulling="2025-10-01 13:51:38.159080671 +0000 UTC m=+870.048711308" observedRunningTime="2025-10-01 13:51:44.436596436 +0000 UTC m=+876.326227063" watchObservedRunningTime="2025-10-01 13:51:44.442034261 +0000 UTC m=+876.331664888" Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.465249 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-2" podStartSLOduration=9.129873856 podStartE2EDuration="18.465227872s" podCreationTimestamp="2025-10-01 13:51:26 +0000 UTC" firstStartedPulling="2025-10-01 13:51:28.796312908 +0000 UTC m=+860.685943505" lastFinishedPulling="2025-10-01 13:51:38.131666904 +0000 UTC m=+870.021297521" observedRunningTime="2025-10-01 13:51:44.463998483 +0000 UTC m=+876.353629130" watchObservedRunningTime="2025-10-01 13:51:44.465227872 +0000 UTC m=+876.354858479" Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.492253 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" podStartSLOduration=2.441817367 podStartE2EDuration="6.492219942s" podCreationTimestamp="2025-10-01 13:51:38 +0000 UTC" firstStartedPulling="2025-10-01 13:51:39.59737173 +0000 UTC m=+871.487002327" lastFinishedPulling="2025-10-01 13:51:43.647774305 +0000 UTC m=+875.537404902" observedRunningTime="2025-10-01 13:51:44.488774339 +0000 UTC m=+876.378404986" watchObservedRunningTime="2025-10-01 13:51:44.492219942 +0000 UTC m=+876.381850579" Oct 01 13:51:44 crc kubenswrapper[4774]: I1001 13:51:44.517739 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/openstack-galera-0" podStartSLOduration=9.123486146 podStartE2EDuration="18.517720759s" podCreationTimestamp="2025-10-01 13:51:26 +0000 UTC" firstStartedPulling="2025-10-01 13:51:28.796374029 +0000 UTC m=+860.686004636" lastFinishedPulling="2025-10-01 13:51:38.190608652 +0000 UTC m=+870.080239249" observedRunningTime="2025-10-01 13:51:44.514208295 +0000 UTC m=+876.403838902" watchObservedRunningTime="2025-10-01 13:51:44.517720759 +0000 UTC m=+876.407351356" Oct 01 13:51:48 crc kubenswrapper[4774]: I1001 13:51:48.279417 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:48 crc kubenswrapper[4774]: I1001 13:51:48.280899 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:51:48 crc kubenswrapper[4774]: I1001 13:51:48.300199 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:48 crc kubenswrapper[4774]: I1001 13:51:48.300273 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:48 crc kubenswrapper[4774]: I1001 13:51:48.308160 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:48 crc kubenswrapper[4774]: I1001 13:51:48.308192 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:51:49 crc kubenswrapper[4774]: I1001 13:51:49.277943 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:49 crc kubenswrapper[4774]: I1001 13:51:49.278001 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:49 crc kubenswrapper[4774]: I1001 13:51:49.339177 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:49 crc kubenswrapper[4774]: I1001 13:51:49.477249 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-l72d5" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.356304 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.381649 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/memcached-0" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.443036 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-2" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.787939 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg"] Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.789385 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.795885 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg"] Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.798764 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cbnjv" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.929947 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.930011 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvz2p\" (UniqueName: \"kubernetes.io/projected/64e22a11-410d-4091-bee5-f6d2ab9baa83-kube-api-access-lvz2p\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:50 crc kubenswrapper[4774]: I1001 13:51:50.930066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 
13:51:51.031126 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvz2p\" (UniqueName: \"kubernetes.io/projected/64e22a11-410d-4091-bee5-f6d2ab9baa83-kube-api-access-lvz2p\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.031194 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.031289 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.031855 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.032186 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.053266 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvz2p\" (UniqueName: \"kubernetes.io/projected/64e22a11-410d-4091-bee5-f6d2ab9baa83-kube-api-access-lvz2p\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.107756 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:51 crc kubenswrapper[4774]: I1001 13:51:51.546360 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg"] Oct 01 13:51:52 crc kubenswrapper[4774]: I1001 13:51:52.484227 4774 generic.go:334] "Generic (PLEG): container finished" podID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerID="ad16b2d8776aa9a08532ef5edb6519a8caec6d230179687904401fb29e4b5b0b" exitCode=0 Oct 01 13:51:52 crc kubenswrapper[4774]: I1001 13:51:52.484605 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" event={"ID":"64e22a11-410d-4091-bee5-f6d2ab9baa83","Type":"ContainerDied","Data":"ad16b2d8776aa9a08532ef5edb6519a8caec6d230179687904401fb29e4b5b0b"} Oct 01 13:51:52 crc kubenswrapper[4774]: I1001 13:51:52.484646 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" event={"ID":"64e22a11-410d-4091-bee5-f6d2ab9baa83","Type":"ContainerStarted","Data":"2c98c14505438a3883cf611ec364826fce0c500831a096cbde7a7dc5bd6f412f"} Oct 01 13:51:53 crc kubenswrapper[4774]: I1001 13:51:53.493920 4774 generic.go:334] "Generic (PLEG): container finished" podID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerID="8cd2aeb4ce6f808aef9bc197a155266e6fedd47388df9671b83eec796ffa99ce" exitCode=0 Oct 01 13:51:53 crc kubenswrapper[4774]: I1001 13:51:53.494024 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" event={"ID":"64e22a11-410d-4091-bee5-f6d2ab9baa83","Type":"ContainerDied","Data":"8cd2aeb4ce6f808aef9bc197a155266e6fedd47388df9671b83eec796ffa99ce"} Oct 01 13:51:54 crc kubenswrapper[4774]: I1001 13:51:54.522368 4774 generic.go:334] "Generic (PLEG): container finished" podID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerID="95855de27eca42552171574f81a0f9ce7fb7e1a6128a08758e0b8ad02a6e4b2c" exitCode=0 Oct 01 13:51:54 crc kubenswrapper[4774]: I1001 13:51:54.522742 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" event={"ID":"64e22a11-410d-4091-bee5-f6d2ab9baa83","Type":"ContainerDied","Data":"95855de27eca42552171574f81a0f9ce7fb7e1a6128a08758e0b8ad02a6e4b2c"} Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.541575 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" event={"ID":"64e22a11-410d-4091-bee5-f6d2ab9baa83","Type":"ContainerDied","Data":"2c98c14505438a3883cf611ec364826fce0c500831a096cbde7a7dc5bd6f412f"} Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.541798 4774 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2c98c14505438a3883cf611ec364826fce0c500831a096cbde7a7dc5bd6f412f" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.587480 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.717494 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvz2p\" (UniqueName: \"kubernetes.io/projected/64e22a11-410d-4091-bee5-f6d2ab9baa83-kube-api-access-lvz2p\") pod \"64e22a11-410d-4091-bee5-f6d2ab9baa83\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.717831 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-bundle\") pod \"64e22a11-410d-4091-bee5-f6d2ab9baa83\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.717980 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-util\") pod \"64e22a11-410d-4091-bee5-f6d2ab9baa83\" (UID: \"64e22a11-410d-4091-bee5-f6d2ab9baa83\") " Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.719264 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-bundle" (OuterVolumeSpecName: "bundle") pod "64e22a11-410d-4091-bee5-f6d2ab9baa83" (UID: "64e22a11-410d-4091-bee5-f6d2ab9baa83"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.725599 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e22a11-410d-4091-bee5-f6d2ab9baa83-kube-api-access-lvz2p" (OuterVolumeSpecName: "kube-api-access-lvz2p") pod "64e22a11-410d-4091-bee5-f6d2ab9baa83" (UID: "64e22a11-410d-4091-bee5-f6d2ab9baa83"). InnerVolumeSpecName "kube-api-access-lvz2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.733919 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-util" (OuterVolumeSpecName: "util") pod "64e22a11-410d-4091-bee5-f6d2ab9baa83" (UID: "64e22a11-410d-4091-bee5-f6d2ab9baa83"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.819967 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.820015 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvz2p\" (UniqueName: \"kubernetes.io/projected/64e22a11-410d-4091-bee5-f6d2ab9baa83-kube-api-access-lvz2p\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:56 crc kubenswrapper[4774]: I1001 13:51:56.820035 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/64e22a11-410d-4091-bee5-f6d2ab9baa83-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:57 crc kubenswrapper[4774]: I1001 13:51:57.549484 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg" Oct 01 13:51:58 crc kubenswrapper[4774]: I1001 13:51:58.360646 4774 prober.go:107] "Probe failed" probeType="Readiness" pod="keystone-kuttl-tests/openstack-galera-2" podUID="6b74e8cc-1edb-4f88-89be-672909669498" containerName="galera" probeResult="failure" output=< Oct 01 13:51:58 crc kubenswrapper[4774]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Oct 01 13:51:58 crc kubenswrapper[4774]: > Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.356367 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.439220 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-1" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.676099 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg"] Oct 01 13:52:03 crc kubenswrapper[4774]: E1001 13:52:03.676332 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="pull" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.676345 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="pull" Oct 01 13:52:03 crc kubenswrapper[4774]: E1001 13:52:03.676358 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="extract" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.676365 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="extract" Oct 01 13:52:03 crc kubenswrapper[4774]: E1001 13:52:03.676379 4774 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="util" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.676385 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="util" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.676527 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e22a11-410d-4091-bee5-f6d2ab9baa83" containerName="extract" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.676972 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.678887 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-tzfxk" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.690258 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg"] Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.826922 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79kt4\" (UniqueName: \"kubernetes.io/projected/0a1bac53-f36d-4a76-a0c6-b19a17eb25f4-kube-api-access-79kt4\") pod \"rabbitmq-cluster-operator-779fc9694b-d2hdg\" (UID: \"0a1bac53-f36d-4a76-a0c6-b19a17eb25f4\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.928563 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79kt4\" (UniqueName: \"kubernetes.io/projected/0a1bac53-f36d-4a76-a0c6-b19a17eb25f4-kube-api-access-79kt4\") pod \"rabbitmq-cluster-operator-779fc9694b-d2hdg\" (UID: \"0a1bac53-f36d-4a76-a0c6-b19a17eb25f4\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" Oct 01 13:52:03 crc kubenswrapper[4774]: I1001 13:52:03.954484 
4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79kt4\" (UniqueName: \"kubernetes.io/projected/0a1bac53-f36d-4a76-a0c6-b19a17eb25f4-kube-api-access-79kt4\") pod \"rabbitmq-cluster-operator-779fc9694b-d2hdg\" (UID: \"0a1bac53-f36d-4a76-a0c6-b19a17eb25f4\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" Oct 01 13:52:04 crc kubenswrapper[4774]: I1001 13:52:04.008312 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" Oct 01 13:52:04 crc kubenswrapper[4774]: I1001 13:52:04.282415 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg"] Oct 01 13:52:04 crc kubenswrapper[4774]: I1001 13:52:04.616885 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" event={"ID":"0a1bac53-f36d-4a76-a0c6-b19a17eb25f4","Type":"ContainerStarted","Data":"6c2eaaa07131fbc1075439fcf473e3d037e87e485a9f5bc092075c03ec2a1176"} Oct 01 13:52:04 crc kubenswrapper[4774]: I1001 13:52:04.771182 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:52:04 crc kubenswrapper[4774]: I1001 13:52:04.827371 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/openstack-galera-0" Oct 01 13:52:06 crc kubenswrapper[4774]: I1001 13:52:06.636286 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" event={"ID":"0a1bac53-f36d-4a76-a0c6-b19a17eb25f4","Type":"ContainerStarted","Data":"927356ca5fe773828a9ad7efeeb9b6afa60139c5ca13861b9f4a4779e1e76d71"} Oct 01 13:52:06 crc kubenswrapper[4774]: I1001 13:52:06.664699 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-d2hdg" podStartSLOduration=1.914743021 podStartE2EDuration="3.664675457s" podCreationTimestamp="2025-10-01 13:52:03 +0000 UTC" firstStartedPulling="2025-10-01 13:52:04.289381946 +0000 UTC m=+896.179012543" lastFinishedPulling="2025-10-01 13:52:06.039314352 +0000 UTC m=+897.928944979" observedRunningTime="2025-10-01 13:52:06.658673105 +0000 UTC m=+898.548303732" watchObservedRunningTime="2025-10-01 13:52:06.664675457 +0000 UTC m=+898.554306084" Oct 01 13:52:07 crc kubenswrapper[4774]: I1001 13:52:07.271211 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:52:07 crc kubenswrapper[4774]: I1001 13:52:07.271302 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.314656 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.317874 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.322225 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-default-user" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.323589 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-server-conf" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.324258 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-server-dockercfg-2gwjt" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.324619 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"rabbitmq-plugins-conf" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.326173 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.336859 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"rabbitmq-erlang-cookie" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.483870 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.483934 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-92493280-bc1f-49cf-b90f-cbab58624046\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92493280-bc1f-49cf-b90f-cbab58624046\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc 
kubenswrapper[4774]: I1001 13:52:12.483963 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5be2fd22-c494-44f3-889d-43561b4bfa34-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.483984 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5be2fd22-c494-44f3-889d-43561b4bfa34-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.484031 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbsj\" (UniqueName: \"kubernetes.io/projected/5be2fd22-c494-44f3-889d-43561b4bfa34-kube-api-access-dtbsj\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.484066 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5be2fd22-c494-44f3-889d-43561b4bfa34-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.484183 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 
13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.484266 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.586080 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5be2fd22-c494-44f3-889d-43561b4bfa34-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.586481 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.586672 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.586852 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.587037 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-92493280-bc1f-49cf-b90f-cbab58624046\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92493280-bc1f-49cf-b90f-cbab58624046\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.587199 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5be2fd22-c494-44f3-889d-43561b4bfa34-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.587337 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5be2fd22-c494-44f3-889d-43561b4bfa34-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.587431 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.587632 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.587844 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dtbsj\" (UniqueName: \"kubernetes.io/projected/5be2fd22-c494-44f3-889d-43561b4bfa34-kube-api-access-dtbsj\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.588672 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5be2fd22-c494-44f3-889d-43561b4bfa34-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.596966 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5be2fd22-c494-44f3-889d-43561b4bfa34-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.597014 4774 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.597328 4774 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-92493280-bc1f-49cf-b90f-cbab58624046\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92493280-bc1f-49cf-b90f-cbab58624046\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/58a60e46b2bd0f178d344bcf69fa6e2ca26902be8b0e3267a5499ccf22ff6180/globalmount\"" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.601341 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5be2fd22-c494-44f3-889d-43561b4bfa34-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.601859 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5be2fd22-c494-44f3-889d-43561b4bfa34-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.631539 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbsj\" (UniqueName: \"kubernetes.io/projected/5be2fd22-c494-44f3-889d-43561b4bfa34-kube-api-access-dtbsj\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.639329 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-92493280-bc1f-49cf-b90f-cbab58624046\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92493280-bc1f-49cf-b90f-cbab58624046\") pod \"rabbitmq-server-0\" (UID: \"5be2fd22-c494-44f3-889d-43561b4bfa34\") " pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:12 crc kubenswrapper[4774]: I1001 13:52:12.673884 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:13 crc kubenswrapper[4774]: I1001 13:52:13.125928 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/rabbitmq-server-0"] Oct 01 13:52:13 crc kubenswrapper[4774]: W1001 13:52:13.133874 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be2fd22_c494_44f3_889d_43561b4bfa34.slice/crio-29b38a8c5cabad342425ca90c130330b302743673d25f7f6d17e645df0a642e2 WatchSource:0}: Error finding container 29b38a8c5cabad342425ca90c130330b302743673d25f7f6d17e645df0a642e2: Status 404 returned error can't find the container with id 29b38a8c5cabad342425ca90c130330b302743673d25f7f6d17e645df0a642e2 Oct 01 13:52:13 crc kubenswrapper[4774]: I1001 13:52:13.709730 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"5be2fd22-c494-44f3-889d-43561b4bfa34","Type":"ContainerStarted","Data":"29b38a8c5cabad342425ca90c130330b302743673d25f7f6d17e645df0a642e2"} Oct 01 13:52:13 crc kubenswrapper[4774]: I1001 13:52:13.961616 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-z58lt"] Oct 01 13:52:13 crc kubenswrapper[4774]: I1001 13:52:13.962784 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:13 crc kubenswrapper[4774]: I1001 13:52:13.964592 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-tknx6" Oct 01 13:52:13 crc kubenswrapper[4774]: I1001 13:52:13.969724 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-z58lt"] Oct 01 13:52:14 crc kubenswrapper[4774]: I1001 13:52:14.110191 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqpxw\" (UniqueName: \"kubernetes.io/projected/bcac7839-f573-479d-8139-21163dd1fd20-kube-api-access-bqpxw\") pod \"keystone-operator-index-z58lt\" (UID: \"bcac7839-f573-479d-8139-21163dd1fd20\") " pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:14 crc kubenswrapper[4774]: I1001 13:52:14.212131 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqpxw\" (UniqueName: \"kubernetes.io/projected/bcac7839-f573-479d-8139-21163dd1fd20-kube-api-access-bqpxw\") pod \"keystone-operator-index-z58lt\" (UID: \"bcac7839-f573-479d-8139-21163dd1fd20\") " pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:14 crc kubenswrapper[4774]: I1001 13:52:14.238528 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqpxw\" (UniqueName: \"kubernetes.io/projected/bcac7839-f573-479d-8139-21163dd1fd20-kube-api-access-bqpxw\") pod \"keystone-operator-index-z58lt\" (UID: \"bcac7839-f573-479d-8139-21163dd1fd20\") " pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:14 crc kubenswrapper[4774]: I1001 13:52:14.292572 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:14 crc kubenswrapper[4774]: I1001 13:52:14.690173 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-z58lt"] Oct 01 13:52:14 crc kubenswrapper[4774]: W1001 13:52:14.701019 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcac7839_f573_479d_8139_21163dd1fd20.slice/crio-f9a49c5ed5c290e1251f6fa099b0e7f1624b04c5cf2c7df90e949ac4c772a7fc WatchSource:0}: Error finding container f9a49c5ed5c290e1251f6fa099b0e7f1624b04c5cf2c7df90e949ac4c772a7fc: Status 404 returned error can't find the container with id f9a49c5ed5c290e1251f6fa099b0e7f1624b04c5cf2c7df90e949ac4c772a7fc Oct 01 13:52:14 crc kubenswrapper[4774]: I1001 13:52:14.716759 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-z58lt" event={"ID":"bcac7839-f573-479d-8139-21163dd1fd20","Type":"ContainerStarted","Data":"f9a49c5ed5c290e1251f6fa099b0e7f1624b04c5cf2c7df90e949ac4c772a7fc"} Oct 01 13:52:17 crc kubenswrapper[4774]: I1001 13:52:17.744428 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-z58lt" event={"ID":"bcac7839-f573-479d-8139-21163dd1fd20","Type":"ContainerStarted","Data":"30fe2a7c1b0ccbe8673d2dce6eccf1f694895a19ed0d07bdfe0400bd7b812677"} Oct 01 13:52:17 crc kubenswrapper[4774]: I1001 13:52:17.765321 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-z58lt" podStartSLOduration=2.720445164 podStartE2EDuration="4.765304358s" podCreationTimestamp="2025-10-01 13:52:13 +0000 UTC" firstStartedPulling="2025-10-01 13:52:14.703310798 +0000 UTC m=+906.592941395" lastFinishedPulling="2025-10-01 13:52:16.748169992 +0000 UTC m=+908.637800589" observedRunningTime="2025-10-01 13:52:17.762826391 +0000 UTC m=+909.652456998" 
watchObservedRunningTime="2025-10-01 13:52:17.765304358 +0000 UTC m=+909.654934965" Oct 01 13:52:22 crc kubenswrapper[4774]: I1001 13:52:22.801074 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"5be2fd22-c494-44f3-889d-43561b4bfa34","Type":"ContainerStarted","Data":"284a931e330ed302556ecc71553de34f0dad2f54f5ec5401041f111741bcd191"} Oct 01 13:52:24 crc kubenswrapper[4774]: I1001 13:52:24.293680 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:24 crc kubenswrapper[4774]: I1001 13:52:24.293771 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:24 crc kubenswrapper[4774]: I1001 13:52:24.325376 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:24 crc kubenswrapper[4774]: I1001 13:52:24.897532 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-z58lt" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.617583 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7"] Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.628925 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.633812 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-cbnjv" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.648340 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7"] Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.705147 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-util\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.705278 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqj2\" (UniqueName: \"kubernetes.io/projected/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-kube-api-access-bpqj2\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.705474 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-bundle\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 
13:52:26.806896 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-util\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.806974 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqj2\" (UniqueName: \"kubernetes.io/projected/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-kube-api-access-bpqj2\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.807594 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-util\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.807706 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-bundle\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.808000 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-bundle\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.843312 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqj2\" (UniqueName: \"kubernetes.io/projected/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-kube-api-access-bpqj2\") pod \"fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:26 crc kubenswrapper[4774]: I1001 13:52:26.966545 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:27 crc kubenswrapper[4774]: I1001 13:52:27.234555 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7"] Oct 01 13:52:27 crc kubenswrapper[4774]: I1001 13:52:27.843758 4774 generic.go:334] "Generic (PLEG): container finished" podID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerID="aa135f9e4dbf1e98806baf1d45cf113ff657208258c87990c7ac7f955756ed9d" exitCode=0 Oct 01 13:52:27 crc kubenswrapper[4774]: I1001 13:52:27.843984 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" event={"ID":"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf","Type":"ContainerDied","Data":"aa135f9e4dbf1e98806baf1d45cf113ff657208258c87990c7ac7f955756ed9d"} Oct 01 13:52:27 crc kubenswrapper[4774]: I1001 13:52:27.844047 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" event={"ID":"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf","Type":"ContainerStarted","Data":"bebbe86b3302f1fa1fecbb31ceb7d8a1d5f22707dcd1f8c84c904ae36c01dbaa"} Oct 01 13:52:28 crc kubenswrapper[4774]: I1001 13:52:28.856421 4774 generic.go:334] "Generic (PLEG): container finished" podID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerID="0d3c00749e28b19db3534e008b992540bfe21fd8fc1609187eb1ea46f5e0afdd" exitCode=0 Oct 01 13:52:28 crc kubenswrapper[4774]: I1001 13:52:28.857631 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" event={"ID":"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf","Type":"ContainerDied","Data":"0d3c00749e28b19db3534e008b992540bfe21fd8fc1609187eb1ea46f5e0afdd"} Oct 01 13:52:29 crc kubenswrapper[4774]: I1001 13:52:29.874805 4774 generic.go:334] "Generic (PLEG): container finished" podID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerID="18cf826660cc532a3c1a5d1d41783bfbfdaae33a9054a2c5c2a872fb7f77e57d" exitCode=0 Oct 01 13:52:29 crc kubenswrapper[4774]: I1001 13:52:29.874885 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" event={"ID":"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf","Type":"ContainerDied","Data":"18cf826660cc532a3c1a5d1d41783bfbfdaae33a9054a2c5c2a872fb7f77e57d"} Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.268764 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.375745 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-bundle\") pod \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.375796 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-util\") pod \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.375860 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqj2\" (UniqueName: \"kubernetes.io/projected/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-kube-api-access-bpqj2\") pod \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\" (UID: \"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf\") " Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.376860 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-bundle" (OuterVolumeSpecName: "bundle") pod "0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" (UID: "0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.385730 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-kube-api-access-bpqj2" (OuterVolumeSpecName: "kube-api-access-bpqj2") pod "0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" (UID: "0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf"). InnerVolumeSpecName "kube-api-access-bpqj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.402247 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-util" (OuterVolumeSpecName: "util") pod "0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" (UID: "0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.477349 4774 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.477382 4774 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.477394 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpqj2\" (UniqueName: \"kubernetes.io/projected/0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf-kube-api-access-bpqj2\") on node \"crc\" DevicePath \"\"" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.893331 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" event={"ID":"0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf","Type":"ContainerDied","Data":"bebbe86b3302f1fa1fecbb31ceb7d8a1d5f22707dcd1f8c84c904ae36c01dbaa"} Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.893389 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bebbe86b3302f1fa1fecbb31ceb7d8a1d5f22707dcd1f8c84c904ae36c01dbaa" Oct 01 13:52:31 crc kubenswrapper[4774]: I1001 13:52:31.893405 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.271422 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.272186 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.965101 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"] Oct 01 13:52:37 crc kubenswrapper[4774]: E1001 13:52:37.965624 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerName="extract" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.965636 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerName="extract" Oct 01 13:52:37 crc kubenswrapper[4774]: E1001 13:52:37.965653 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerName="pull" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.965659 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerName="pull" Oct 01 13:52:37 crc kubenswrapper[4774]: E1001 13:52:37.965669 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" 
containerName="util" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.965675 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerName="util" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.965766 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf" containerName="extract" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.966360 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.969474 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dvsks" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.969492 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Oct 01 13:52:37 crc kubenswrapper[4774]: I1001 13:52:37.976062 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"] Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.072828 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe84b77c-3e6a-4244-8ef5-c6747459fabc-webhook-cert\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.072879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxbgk\" (UniqueName: \"kubernetes.io/projected/fe84b77c-3e6a-4244-8ef5-c6747459fabc-kube-api-access-hxbgk\") pod 
\"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.073028 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe84b77c-3e6a-4244-8ef5-c6747459fabc-apiservice-cert\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.174598 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe84b77c-3e6a-4244-8ef5-c6747459fabc-apiservice-cert\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.174671 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe84b77c-3e6a-4244-8ef5-c6747459fabc-webhook-cert\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.174701 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxbgk\" (UniqueName: \"kubernetes.io/projected/fe84b77c-3e6a-4244-8ef5-c6747459fabc-kube-api-access-hxbgk\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " 
pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.183379 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fe84b77c-3e6a-4244-8ef5-c6747459fabc-webhook-cert\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.184097 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fe84b77c-3e6a-4244-8ef5-c6747459fabc-apiservice-cert\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.203121 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxbgk\" (UniqueName: \"kubernetes.io/projected/fe84b77c-3e6a-4244-8ef5-c6747459fabc-kube-api-access-hxbgk\") pod \"keystone-operator-controller-manager-7d9d9bb4b5-fr745\" (UID: \"fe84b77c-3e6a-4244-8ef5-c6747459fabc\") " pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.288930 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.525018 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"] Oct 01 13:52:38 crc kubenswrapper[4774]: W1001 13:52:38.526593 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe84b77c_3e6a_4244_8ef5_c6747459fabc.slice/crio-93527dcc18ea992690af55e6cb17f42d8dd9d494872e050ce90e9e6f95aa7151 WatchSource:0}: Error finding container 93527dcc18ea992690af55e6cb17f42d8dd9d494872e050ce90e9e6f95aa7151: Status 404 returned error can't find the container with id 93527dcc18ea992690af55e6cb17f42d8dd9d494872e050ce90e9e6f95aa7151 Oct 01 13:52:38 crc kubenswrapper[4774]: I1001 13:52:38.940510 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"93527dcc18ea992690af55e6cb17f42d8dd9d494872e050ce90e9e6f95aa7151"} Oct 01 13:52:41 crc kubenswrapper[4774]: I1001 13:52:41.966979 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"3a7a736c4f823eab2910dec0cccbca7dff735f259bf49121d23ef355416c9e32"} Oct 01 13:52:42 crc kubenswrapper[4774]: I1001 13:52:42.974161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"280ab43d7f7cab7c575082971b801f42485be6db259c44e5e8a8a9085e8236b2"} Oct 01 13:52:42 crc kubenswrapper[4774]: I1001 13:52:42.975675 4774 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:42 crc kubenswrapper[4774]: I1001 13:52:42.996021 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podStartSLOduration=2.414709839 podStartE2EDuration="5.99600549s" podCreationTimestamp="2025-10-01 13:52:37 +0000 UTC" firstStartedPulling="2025-10-01 13:52:38.529241096 +0000 UTC m=+930.418871703" lastFinishedPulling="2025-10-01 13:52:42.110536747 +0000 UTC m=+934.000167354" observedRunningTime="2025-10-01 13:52:42.995184188 +0000 UTC m=+934.884814815" watchObservedRunningTime="2025-10-01 13:52:42.99600549 +0000 UTC m=+934.885636087" Oct 01 13:52:48 crc kubenswrapper[4774]: I1001 13:52:48.295558 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.621705 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-clq2d"] Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.623237 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.637047 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-clq2d"] Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.662705 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnx6r\" (UniqueName: \"kubernetes.io/projected/a72d2d4c-889c-427a-b99f-f9607ceb47e3-kube-api-access-lnx6r\") pod \"keystone-db-create-clq2d\" (UID: \"a72d2d4c-889c-427a-b99f-f9607ceb47e3\") " pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.764822 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnx6r\" (UniqueName: \"kubernetes.io/projected/a72d2d4c-889c-427a-b99f-f9607ceb47e3-kube-api-access-lnx6r\") pod \"keystone-db-create-clq2d\" (UID: \"a72d2d4c-889c-427a-b99f-f9607ceb47e3\") " pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.790890 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnx6r\" (UniqueName: \"kubernetes.io/projected/a72d2d4c-889c-427a-b99f-f9607ceb47e3-kube-api-access-lnx6r\") pod \"keystone-db-create-clq2d\" (UID: \"a72d2d4c-889c-427a-b99f-f9607ceb47e3\") " pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:50 crc kubenswrapper[4774]: I1001 13:52:50.952393 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:51 crc kubenswrapper[4774]: I1001 13:52:51.283628 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-clq2d"] Oct 01 13:52:51 crc kubenswrapper[4774]: E1001 13:52:51.629647 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda72d2d4c_889c_427a_b99f_f9607ceb47e3.slice/crio-conmon-89eeddf8d71cfe8f2e08615aadc8739b2680a4a2277361a81ebf51dd7ac025f8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda72d2d4c_889c_427a_b99f_f9607ceb47e3.slice/crio-89eeddf8d71cfe8f2e08615aadc8739b2680a4a2277361a81ebf51dd7ac025f8.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:52:52 crc kubenswrapper[4774]: I1001 13:52:52.046227 4774 generic.go:334] "Generic (PLEG): container finished" podID="a72d2d4c-889c-427a-b99f-f9607ceb47e3" containerID="89eeddf8d71cfe8f2e08615aadc8739b2680a4a2277361a81ebf51dd7ac025f8" exitCode=0 Oct 01 13:52:52 crc kubenswrapper[4774]: I1001 13:52:52.046386 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-clq2d" event={"ID":"a72d2d4c-889c-427a-b99f-f9607ceb47e3","Type":"ContainerDied","Data":"89eeddf8d71cfe8f2e08615aadc8739b2680a4a2277361a81ebf51dd7ac025f8"} Oct 01 13:52:52 crc kubenswrapper[4774]: I1001 13:52:52.047171 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-clq2d" event={"ID":"a72d2d4c-889c-427a-b99f-f9607ceb47e3","Type":"ContainerStarted","Data":"6a0d6f31387cf9fca850cbf1a12d1eb7a2ec14d25240390345f1293289ec26a3"} Oct 01 13:52:53 crc kubenswrapper[4774]: I1001 13:52:53.300320 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:53 crc kubenswrapper[4774]: I1001 13:52:53.413680 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnx6r\" (UniqueName: \"kubernetes.io/projected/a72d2d4c-889c-427a-b99f-f9607ceb47e3-kube-api-access-lnx6r\") pod \"a72d2d4c-889c-427a-b99f-f9607ceb47e3\" (UID: \"a72d2d4c-889c-427a-b99f-f9607ceb47e3\") " Oct 01 13:52:53 crc kubenswrapper[4774]: I1001 13:52:53.423196 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72d2d4c-889c-427a-b99f-f9607ceb47e3-kube-api-access-lnx6r" (OuterVolumeSpecName: "kube-api-access-lnx6r") pod "a72d2d4c-889c-427a-b99f-f9607ceb47e3" (UID: "a72d2d4c-889c-427a-b99f-f9607ceb47e3"). InnerVolumeSpecName "kube-api-access-lnx6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:52:53 crc kubenswrapper[4774]: I1001 13:52:53.515247 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnx6r\" (UniqueName: \"kubernetes.io/projected/a72d2d4c-889c-427a-b99f-f9607ceb47e3-kube-api-access-lnx6r\") on node \"crc\" DevicePath \"\"" Oct 01 13:52:54 crc kubenswrapper[4774]: I1001 13:52:54.061616 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-clq2d" event={"ID":"a72d2d4c-889c-427a-b99f-f9607ceb47e3","Type":"ContainerDied","Data":"6a0d6f31387cf9fca850cbf1a12d1eb7a2ec14d25240390345f1293289ec26a3"} Oct 01 13:52:54 crc kubenswrapper[4774]: I1001 13:52:54.061674 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a0d6f31387cf9fca850cbf1a12d1eb7a2ec14d25240390345f1293289ec26a3" Oct 01 13:52:54 crc kubenswrapper[4774]: I1001 13:52:54.061738 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-clq2d" Oct 01 13:52:55 crc kubenswrapper[4774]: I1001 13:52:55.072006 4774 generic.go:334] "Generic (PLEG): container finished" podID="5be2fd22-c494-44f3-889d-43561b4bfa34" containerID="284a931e330ed302556ecc71553de34f0dad2f54f5ec5401041f111741bcd191" exitCode=0 Oct 01 13:52:55 crc kubenswrapper[4774]: I1001 13:52:55.072382 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"5be2fd22-c494-44f3-889d-43561b4bfa34","Type":"ContainerDied","Data":"284a931e330ed302556ecc71553de34f0dad2f54f5ec5401041f111741bcd191"} Oct 01 13:52:56 crc kubenswrapper[4774]: I1001 13:52:56.084046 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/rabbitmq-server-0" event={"ID":"5be2fd22-c494-44f3-889d-43561b4bfa34","Type":"ContainerStarted","Data":"6024ee7ed38d90c24ab314144f8a20660ba50bad7a033aa1d584aa2c03222417"} Oct 01 13:52:56 crc kubenswrapper[4774]: I1001 13:52:56.084350 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:52:56 crc kubenswrapper[4774]: I1001 13:52:56.113955 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.543833207 podStartE2EDuration="45.113921583s" podCreationTimestamp="2025-10-01 13:52:11 +0000 UTC" firstStartedPulling="2025-10-01 13:52:13.13970994 +0000 UTC m=+905.029340567" lastFinishedPulling="2025-10-01 13:52:20.709798316 +0000 UTC m=+912.599428943" observedRunningTime="2025-10-01 13:52:56.112096434 +0000 UTC m=+948.001727091" watchObservedRunningTime="2025-10-01 13:52:56.113921583 +0000 UTC m=+948.003552220" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.530708 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-1594-account-create-vcc45"] Oct 01 13:53:00 crc kubenswrapper[4774]: E1001 
13:53:00.531731 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d2d4c-889c-427a-b99f-f9607ceb47e3" containerName="mariadb-database-create" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.531763 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d2d4c-889c-427a-b99f-f9607ceb47e3" containerName="mariadb-database-create" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.532059 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72d2d4c-889c-427a-b99f-f9607ceb47e3" containerName="mariadb-database-create" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.533113 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.535920 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.538991 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-1594-account-create-vcc45"] Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.624138 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qkf\" (UniqueName: \"kubernetes.io/projected/47f35a62-5183-4425-b4ff-cb07e202032c-kube-api-access-q6qkf\") pod \"keystone-1594-account-create-vcc45\" (UID: \"47f35a62-5183-4425-b4ff-cb07e202032c\") " pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.726251 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qkf\" (UniqueName: \"kubernetes.io/projected/47f35a62-5183-4425-b4ff-cb07e202032c-kube-api-access-q6qkf\") pod \"keystone-1594-account-create-vcc45\" (UID: \"47f35a62-5183-4425-b4ff-cb07e202032c\") " 
pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.760872 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qkf\" (UniqueName: \"kubernetes.io/projected/47f35a62-5183-4425-b4ff-cb07e202032c-kube-api-access-q6qkf\") pod \"keystone-1594-account-create-vcc45\" (UID: \"47f35a62-5183-4425-b4ff-cb07e202032c\") " pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:00 crc kubenswrapper[4774]: I1001 13:53:00.857024 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:01 crc kubenswrapper[4774]: I1001 13:53:01.328545 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-1594-account-create-vcc45"] Oct 01 13:53:02 crc kubenswrapper[4774]: I1001 13:53:02.132224 4774 generic.go:334] "Generic (PLEG): container finished" podID="47f35a62-5183-4425-b4ff-cb07e202032c" containerID="f0bc8b6b98a64d3569d6745238a4f8d98bd01a2a024983efc9d4f996fdef984a" exitCode=0 Oct 01 13:53:02 crc kubenswrapper[4774]: I1001 13:53:02.132405 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" event={"ID":"47f35a62-5183-4425-b4ff-cb07e202032c","Type":"ContainerDied","Data":"f0bc8b6b98a64d3569d6745238a4f8d98bd01a2a024983efc9d4f996fdef984a"} Oct 01 13:53:02 crc kubenswrapper[4774]: I1001 13:53:02.132738 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" event={"ID":"47f35a62-5183-4425-b4ff-cb07e202032c","Type":"ContainerStarted","Data":"b1f7678eab4e7de2437530720d8e34d9153e53afd465503a9bf8549be4855953"} Oct 01 13:53:03 crc kubenswrapper[4774]: I1001 13:53:03.473580 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:03 crc kubenswrapper[4774]: I1001 13:53:03.564123 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6qkf\" (UniqueName: \"kubernetes.io/projected/47f35a62-5183-4425-b4ff-cb07e202032c-kube-api-access-q6qkf\") pod \"47f35a62-5183-4425-b4ff-cb07e202032c\" (UID: \"47f35a62-5183-4425-b4ff-cb07e202032c\") " Oct 01 13:53:03 crc kubenswrapper[4774]: I1001 13:53:03.570089 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f35a62-5183-4425-b4ff-cb07e202032c-kube-api-access-q6qkf" (OuterVolumeSpecName: "kube-api-access-q6qkf") pod "47f35a62-5183-4425-b4ff-cb07e202032c" (UID: "47f35a62-5183-4425-b4ff-cb07e202032c"). InnerVolumeSpecName "kube-api-access-q6qkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:53:03 crc kubenswrapper[4774]: I1001 13:53:03.666001 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6qkf\" (UniqueName: \"kubernetes.io/projected/47f35a62-5183-4425-b4ff-cb07e202032c-kube-api-access-q6qkf\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:04 crc kubenswrapper[4774]: I1001 13:53:04.151515 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" event={"ID":"47f35a62-5183-4425-b4ff-cb07e202032c","Type":"ContainerDied","Data":"b1f7678eab4e7de2437530720d8e34d9153e53afd465503a9bf8549be4855953"} Oct 01 13:53:04 crc kubenswrapper[4774]: I1001 13:53:04.151555 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1f7678eab4e7de2437530720d8e34d9153e53afd465503a9bf8549be4855953" Oct 01 13:53:04 crc kubenswrapper[4774]: I1001 13:53:04.151606 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-1594-account-create-vcc45" Oct 01 13:53:07 crc kubenswrapper[4774]: I1001 13:53:07.270570 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:53:07 crc kubenswrapper[4774]: I1001 13:53:07.271546 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:53:07 crc kubenswrapper[4774]: I1001 13:53:07.271637 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:53:07 crc kubenswrapper[4774]: I1001 13:53:07.272707 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47c8b3c7b9b44c1a4fad799db6db52fcadf5c6e425449337453c911dfbb7a1cd"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:53:07 crc kubenswrapper[4774]: I1001 13:53:07.272864 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://47c8b3c7b9b44c1a4fad799db6db52fcadf5c6e425449337453c911dfbb7a1cd" gracePeriod=600 Oct 01 13:53:08 crc kubenswrapper[4774]: I1001 13:53:08.197272 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="47c8b3c7b9b44c1a4fad799db6db52fcadf5c6e425449337453c911dfbb7a1cd" exitCode=0 Oct 01 13:53:08 crc kubenswrapper[4774]: I1001 13:53:08.197312 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"47c8b3c7b9b44c1a4fad799db6db52fcadf5c6e425449337453c911dfbb7a1cd"} Oct 01 13:53:08 crc kubenswrapper[4774]: I1001 13:53:08.197672 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"4e1986b6aa4082eaba4db55db350683c8c3d94491cc5ccf5f341ed6826a24126"} Oct 01 13:53:08 crc kubenswrapper[4774]: I1001 13:53:08.197719 4774 scope.go:117] "RemoveContainer" containerID="f5fe54fbf797d4b0be5b34d0ef2c73873c911635f7884f87ede82bc1e52a3917" Oct 01 13:53:12 crc kubenswrapper[4774]: I1001 13:53:12.676791 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/rabbitmq-server-0" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.286834 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-2kpnl"] Oct 01 13:53:13 crc kubenswrapper[4774]: E1001 13:53:13.287227 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f35a62-5183-4425-b4ff-cb07e202032c" containerName="mariadb-account-create" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.287255 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f35a62-5183-4425-b4ff-cb07e202032c" containerName="mariadb-account-create" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.287561 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f35a62-5183-4425-b4ff-cb07e202032c" containerName="mariadb-account-create" Oct 01 13:53:13 crc kubenswrapper[4774]: 
I1001 13:53:13.288215 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.292489 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.292682 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.292851 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.296956 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-v7xxn" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.301092 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-2kpnl"] Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.414179 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-config-data\") pod \"keystone-db-sync-2kpnl\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.414530 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txkdz\" (UniqueName: \"kubernetes.io/projected/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-kube-api-access-txkdz\") pod \"keystone-db-sync-2kpnl\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.516615 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-config-data\") pod \"keystone-db-sync-2kpnl\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.516713 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txkdz\" (UniqueName: \"kubernetes.io/projected/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-kube-api-access-txkdz\") pod \"keystone-db-sync-2kpnl\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.527196 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-config-data\") pod \"keystone-db-sync-2kpnl\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.542611 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txkdz\" (UniqueName: \"kubernetes.io/projected/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-kube-api-access-txkdz\") pod \"keystone-db-sync-2kpnl\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.610877 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:13 crc kubenswrapper[4774]: I1001 13:53:13.903760 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-2kpnl"] Oct 01 13:53:14 crc kubenswrapper[4774]: I1001 13:53:14.256589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" event={"ID":"dbce09d0-de6d-4b0e-9589-3427e3fd79f9","Type":"ContainerStarted","Data":"0e6d1344adbdfdf834e72c4a9d0bf0a46890f25d82f9360333270b044537eb8d"} Oct 01 13:53:23 crc kubenswrapper[4774]: I1001 13:53:23.314995 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" event={"ID":"dbce09d0-de6d-4b0e-9589-3427e3fd79f9","Type":"ContainerStarted","Data":"66d3cb431714dcee9d8232f0d0b5a4dbf07b26c4f11b31f6ecc2fcee8b114555"} Oct 01 13:53:23 crc kubenswrapper[4774]: I1001 13:53:23.338658 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" podStartSLOduration=1.186502938 podStartE2EDuration="10.33863337s" podCreationTimestamp="2025-10-01 13:53:13 +0000 UTC" firstStartedPulling="2025-10-01 13:53:13.914707973 +0000 UTC m=+965.804338580" lastFinishedPulling="2025-10-01 13:53:23.066838375 +0000 UTC m=+974.956469012" observedRunningTime="2025-10-01 13:53:23.336632808 +0000 UTC m=+975.226263425" watchObservedRunningTime="2025-10-01 13:53:23.33863337 +0000 UTC m=+975.228264007" Oct 01 13:53:27 crc kubenswrapper[4774]: I1001 13:53:27.351130 4774 generic.go:334] "Generic (PLEG): container finished" podID="dbce09d0-de6d-4b0e-9589-3427e3fd79f9" containerID="66d3cb431714dcee9d8232f0d0b5a4dbf07b26c4f11b31f6ecc2fcee8b114555" exitCode=0 Oct 01 13:53:27 crc kubenswrapper[4774]: I1001 13:53:27.351250 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" 
event={"ID":"dbce09d0-de6d-4b0e-9589-3427e3fd79f9","Type":"ContainerDied","Data":"66d3cb431714dcee9d8232f0d0b5a4dbf07b26c4f11b31f6ecc2fcee8b114555"} Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.637742 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.732106 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txkdz\" (UniqueName: \"kubernetes.io/projected/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-kube-api-access-txkdz\") pod \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.732189 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-config-data\") pod \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\" (UID: \"dbce09d0-de6d-4b0e-9589-3427e3fd79f9\") " Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.740547 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-kube-api-access-txkdz" (OuterVolumeSpecName: "kube-api-access-txkdz") pod "dbce09d0-de6d-4b0e-9589-3427e3fd79f9" (UID: "dbce09d0-de6d-4b0e-9589-3427e3fd79f9"). InnerVolumeSpecName "kube-api-access-txkdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.771592 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-config-data" (OuterVolumeSpecName: "config-data") pod "dbce09d0-de6d-4b0e-9589-3427e3fd79f9" (UID: "dbce09d0-de6d-4b0e-9589-3427e3fd79f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.834387 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:28 crc kubenswrapper[4774]: I1001 13:53:28.834642 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txkdz\" (UniqueName: \"kubernetes.io/projected/dbce09d0-de6d-4b0e-9589-3427e3fd79f9-kube-api-access-txkdz\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.367060 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" event={"ID":"dbce09d0-de6d-4b0e-9589-3427e3fd79f9","Type":"ContainerDied","Data":"0e6d1344adbdfdf834e72c4a9d0bf0a46890f25d82f9360333270b044537eb8d"} Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.367478 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6d1344adbdfdf834e72c4a9d0bf0a46890f25d82f9360333270b044537eb8d" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.367127 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-2kpnl" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.614195 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-zpgtw"] Oct 01 13:53:29 crc kubenswrapper[4774]: E1001 13:53:29.614669 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbce09d0-de6d-4b0e-9589-3427e3fd79f9" containerName="keystone-db-sync" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.614698 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbce09d0-de6d-4b0e-9589-3427e3fd79f9" containerName="keystone-db-sync" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.614928 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbce09d0-de6d-4b0e-9589-3427e3fd79f9" containerName="keystone-db-sync" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.615625 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.617238 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.617509 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.618185 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.618382 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-v7xxn" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.622113 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-zpgtw"] Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.754984 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-scripts\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.755555 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-config-data\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.755650 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-credential-keys\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.755742 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8wzx\" (UniqueName: \"kubernetes.io/projected/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-kube-api-access-f8wzx\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.755883 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-fernet-keys\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc 
kubenswrapper[4774]: I1001 13:53:29.857307 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-config-data\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.857592 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-credential-keys\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.857685 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8wzx\" (UniqueName: \"kubernetes.io/projected/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-kube-api-access-f8wzx\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.857811 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-fernet-keys\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.858274 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-scripts\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.863118 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-scripts\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.863148 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-fernet-keys\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.863150 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-config-data\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.864178 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-credential-keys\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.877244 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8wzx\" (UniqueName: \"kubernetes.io/projected/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-kube-api-access-f8wzx\") pod \"keystone-bootstrap-zpgtw\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:29 crc kubenswrapper[4774]: I1001 13:53:29.934528 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:30 crc kubenswrapper[4774]: I1001 13:53:30.188436 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-zpgtw"] Oct 01 13:53:30 crc kubenswrapper[4774]: W1001 13:53:30.200079 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf54aefd2_b811_4ffe_944e_7fe562e4fcdd.slice/crio-d31936f1fc69467d88a80330b34fd40fd2d8f895348d174bb9ddcac0052202ae WatchSource:0}: Error finding container d31936f1fc69467d88a80330b34fd40fd2d8f895348d174bb9ddcac0052202ae: Status 404 returned error can't find the container with id d31936f1fc69467d88a80330b34fd40fd2d8f895348d174bb9ddcac0052202ae Oct 01 13:53:30 crc kubenswrapper[4774]: I1001 13:53:30.376326 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" event={"ID":"f54aefd2-b811-4ffe-944e-7fe562e4fcdd","Type":"ContainerStarted","Data":"29b807d06e04e6131a92e43adae71ecd618b89a9e9a595136e8983e970622202"} Oct 01 13:53:30 crc kubenswrapper[4774]: I1001 13:53:30.376379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" event={"ID":"f54aefd2-b811-4ffe-944e-7fe562e4fcdd","Type":"ContainerStarted","Data":"d31936f1fc69467d88a80330b34fd40fd2d8f895348d174bb9ddcac0052202ae"} Oct 01 13:53:30 crc kubenswrapper[4774]: I1001 13:53:30.412746 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" podStartSLOduration=1.41269732 podStartE2EDuration="1.41269732s" podCreationTimestamp="2025-10-01 13:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:53:30.397585814 +0000 UTC m=+982.287216401" watchObservedRunningTime="2025-10-01 13:53:30.41269732 +0000 UTC 
m=+982.302328007" Oct 01 13:53:33 crc kubenswrapper[4774]: I1001 13:53:33.398543 4774 generic.go:334] "Generic (PLEG): container finished" podID="f54aefd2-b811-4ffe-944e-7fe562e4fcdd" containerID="29b807d06e04e6131a92e43adae71ecd618b89a9e9a595136e8983e970622202" exitCode=0 Oct 01 13:53:33 crc kubenswrapper[4774]: I1001 13:53:33.398608 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" event={"ID":"f54aefd2-b811-4ffe-944e-7fe562e4fcdd","Type":"ContainerDied","Data":"29b807d06e04e6131a92e43adae71ecd618b89a9e9a595136e8983e970622202"} Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.750798 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.834875 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-fernet-keys\") pod \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.834962 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8wzx\" (UniqueName: \"kubernetes.io/projected/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-kube-api-access-f8wzx\") pod \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.835027 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-scripts\") pod \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.835128 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-credential-keys\") pod \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.835196 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-config-data\") pod \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\" (UID: \"f54aefd2-b811-4ffe-944e-7fe562e4fcdd\") " Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.841303 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f54aefd2-b811-4ffe-944e-7fe562e4fcdd" (UID: "f54aefd2-b811-4ffe-944e-7fe562e4fcdd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.841568 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-scripts" (OuterVolumeSpecName: "scripts") pod "f54aefd2-b811-4ffe-944e-7fe562e4fcdd" (UID: "f54aefd2-b811-4ffe-944e-7fe562e4fcdd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.842606 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-kube-api-access-f8wzx" (OuterVolumeSpecName: "kube-api-access-f8wzx") pod "f54aefd2-b811-4ffe-944e-7fe562e4fcdd" (UID: "f54aefd2-b811-4ffe-944e-7fe562e4fcdd"). InnerVolumeSpecName "kube-api-access-f8wzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.842762 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f54aefd2-b811-4ffe-944e-7fe562e4fcdd" (UID: "f54aefd2-b811-4ffe-944e-7fe562e4fcdd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.855606 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-config-data" (OuterVolumeSpecName: "config-data") pod "f54aefd2-b811-4ffe-944e-7fe562e4fcdd" (UID: "f54aefd2-b811-4ffe-944e-7fe562e4fcdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.937777 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.937815 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8wzx\" (UniqueName: \"kubernetes.io/projected/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-kube-api-access-f8wzx\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.937828 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:34 crc kubenswrapper[4774]: I1001 13:53:34.937840 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:34 crc 
kubenswrapper[4774]: I1001 13:53:34.937863 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54aefd2-b811-4ffe-944e-7fe562e4fcdd-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.419061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" event={"ID":"f54aefd2-b811-4ffe-944e-7fe562e4fcdd","Type":"ContainerDied","Data":"d31936f1fc69467d88a80330b34fd40fd2d8f895348d174bb9ddcac0052202ae"} Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.419119 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d31936f1fc69467d88a80330b34fd40fd2d8f895348d174bb9ddcac0052202ae" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.419186 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-zpgtw" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.614953 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-5679445c6f-vgdd2"] Oct 01 13:53:35 crc kubenswrapper[4774]: E1001 13:53:35.615237 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54aefd2-b811-4ffe-944e-7fe562e4fcdd" containerName="keystone-bootstrap" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.615251 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54aefd2-b811-4ffe-944e-7fe562e4fcdd" containerName="keystone-bootstrap" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.615406 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54aefd2-b811-4ffe-944e-7fe562e4fcdd" containerName="keystone-bootstrap" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.615903 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.621257 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-v7xxn" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.621313 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.622367 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.622938 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.636719 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5679445c6f-vgdd2"] Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.754444 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhb4b\" (UniqueName: \"kubernetes.io/projected/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-kube-api-access-nhb4b\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.754557 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-fernet-keys\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.754613 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-scripts\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.754786 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-config-data\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.754879 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-credential-keys\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.856333 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhb4b\" (UniqueName: \"kubernetes.io/projected/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-kube-api-access-nhb4b\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.856386 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-fernet-keys\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.856420 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-scripts\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.856484 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-config-data\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.856521 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-credential-keys\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.865698 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-scripts\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.865781 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-config-data\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.867902 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-credential-keys\") pod 
\"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.870216 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-fernet-keys\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.887073 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhb4b\" (UniqueName: \"kubernetes.io/projected/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-kube-api-access-nhb4b\") pod \"keystone-5679445c6f-vgdd2\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:35 crc kubenswrapper[4774]: I1001 13:53:35.941669 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:53:36 crc kubenswrapper[4774]: I1001 13:53:36.158294 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-5679445c6f-vgdd2"] Oct 01 13:53:36 crc kubenswrapper[4774]: I1001 13:53:36.429924 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" event={"ID":"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f","Type":"ContainerStarted","Data":"aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c"} Oct 01 13:53:36 crc kubenswrapper[4774]: I1001 13:53:36.429982 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" event={"ID":"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f","Type":"ContainerStarted","Data":"63d1d449037988ceed55ccf4279d67eaa2c9a3a4ea522d366bf57d392ea998fb"} Oct 01 13:53:36 crc kubenswrapper[4774]: I1001 13:53:36.430088 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:54:07 crc kubenswrapper[4774]: I1001 13:54:07.364293 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:54:07 crc kubenswrapper[4774]: I1001 13:54:07.389619 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" podStartSLOduration=32.389597819 podStartE2EDuration="32.389597819s" podCreationTimestamp="2025-10-01 13:53:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:53:36.450622865 +0000 UTC m=+988.340253502" watchObservedRunningTime="2025-10-01 13:54:07.389597819 +0000 UTC m=+1019.279228446" Oct 01 13:54:09 crc kubenswrapper[4774]: E1001 13:54:09.097323 4774 log.go:32] "Failed when writing line to log file" err="http2: stream closed" 
path="/var/log/pods/keystone-kuttl-tests_keystone-5679445c6f-vgdd2_b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f/keystone-api/0.log" line={} Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.429107 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-797b484697-p47d7"] Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.430230 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.443963 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-797b484697-p47d7"] Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.514643 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.514686 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd9ch\" (UniqueName: \"kubernetes.io/projected/116f64b7-2ad7-45ed-b587-b713cfeb3e58-kube-api-access-fd9ch\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.514735 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.514797 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.514897 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: E1001 13:54:09.547252 4774 log.go:32] "Failed when writing line to log file" err="http2: stream closed" path="/var/log/pods/keystone-kuttl-tests_keystone-5679445c6f-vgdd2_b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f/keystone-api/0.log" line={} Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.616495 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.616594 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd9ch\" (UniqueName: \"kubernetes.io/projected/116f64b7-2ad7-45ed-b587-b713cfeb3e58-kube-api-access-fd9ch\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.616700 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.616770 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.616900 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.622612 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.624183 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.626090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys\") pod 
\"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.632770 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.648014 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd9ch\" (UniqueName: \"kubernetes.io/projected/116f64b7-2ad7-45ed-b587-b713cfeb3e58-kube-api-access-fd9ch\") pod \"keystone-797b484697-p47d7\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:09 crc kubenswrapper[4774]: I1001 13:54:09.752673 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.218800 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-797b484697-p47d7"] Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.759798 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" event={"ID":"116f64b7-2ad7-45ed-b587-b713cfeb3e58","Type":"ContainerStarted","Data":"2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d"} Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.760305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" event={"ID":"116f64b7-2ad7-45ed-b587-b713cfeb3e58","Type":"ContainerStarted","Data":"cf879eb53a1ee44e91664e02dcd22d9dbf886e55c063ea81fd9dfbc3e033ec63"} Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.760403 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.794003 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" podStartSLOduration=1.793984772 podStartE2EDuration="1.793984772s" podCreationTimestamp="2025-10-01 13:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:54:10.79161702 +0000 UTC m=+1022.681247637" watchObservedRunningTime="2025-10-01 13:54:10.793984772 +0000 UTC m=+1022.683615369" Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.974228 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-2kpnl"] Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.984236 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-zpgtw"] 
Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.990105 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-zpgtw"] Oct 01 13:54:10 crc kubenswrapper[4774]: I1001 13:54:10.999510 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-797b484697-p47d7"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.003226 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-2kpnl"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.010243 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5679445c6f-vgdd2"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.010578 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" podUID="b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" containerName="keystone-api" containerID="cri-o://aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c" gracePeriod=30 Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.018586 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone1594-account-delete-m95dp"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.020588 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.038853 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl7k9\" (UniqueName: \"kubernetes.io/projected/a9800108-6f64-4584-8d84-3cf5c548772b-kube-api-access-rl7k9\") pod \"keystone1594-account-delete-m95dp\" (UID: \"a9800108-6f64-4584-8d84-3cf5c548772b\") " pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.056506 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone1594-account-delete-m95dp"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.062579 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-clq2d"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.072419 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-clq2d"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.077513 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone1594-account-delete-m95dp"] Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.077978 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rl7k9], unattached volumes=[], failed to process volumes=[]: context canceled" pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" podUID="a9800108-6f64-4584-8d84-3cf5c548772b" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.081562 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-1594-account-create-vcc45"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.085258 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-1594-account-create-vcc45"] Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 
13:54:11.140076 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl7k9\" (UniqueName: \"kubernetes.io/projected/a9800108-6f64-4584-8d84-3cf5c548772b-kube-api-access-rl7k9\") pod \"keystone1594-account-delete-m95dp\" (UID: \"a9800108-6f64-4584-8d84-3cf5c548772b\") " pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.160965 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl7k9\" (UniqueName: \"kubernetes.io/projected/a9800108-6f64-4584-8d84-3cf5c548772b-kube-api-access-rl7k9\") pod \"keystone1594-account-delete-m95dp\" (UID: \"a9800108-6f64-4584-8d84-3cf5c548772b\") " pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.767211 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.767783 4774 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="keystone-kuttl-tests/keystone-797b484697-p47d7" secret="" err="secret \"keystone-keystone-dockercfg-v7xxn\" not found" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.777756 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.950933 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl7k9\" (UniqueName: \"kubernetes.io/projected/a9800108-6f64-4584-8d84-3cf5c548772b-kube-api-access-rl7k9\") pod \"a9800108-6f64-4584-8d84-3cf5c548772b\" (UID: \"a9800108-6f64-4584-8d84-3cf5c548772b\") " Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.951697 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.951760 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:12.451740881 +0000 UTC m=+1024.341371488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone-config-data" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.952831 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.952864 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:12.45285323 +0000 UTC m=+1024.342483837 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.953259 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.953321 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:12.453305531 +0000 UTC m=+1024.342936138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.953387 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Oct 01 13:54:11 crc kubenswrapper[4774]: E1001 13:54:11.953420 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:12.453411444 +0000 UTC m=+1024.343042051 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone-scripts" not found Oct 01 13:54:11 crc kubenswrapper[4774]: I1001 13:54:11.957677 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9800108-6f64-4584-8d84-3cf5c548772b-kube-api-access-rl7k9" (OuterVolumeSpecName: "kube-api-access-rl7k9") pod "a9800108-6f64-4584-8d84-3cf5c548772b" (UID: "a9800108-6f64-4584-8d84-3cf5c548772b"). InnerVolumeSpecName "kube-api-access-rl7k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.053718 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl7k9\" (UniqueName: \"kubernetes.io/projected/a9800108-6f64-4584-8d84-3cf5c548772b-kube-api-access-rl7k9\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.458518 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.458871 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:13.458854916 +0000 UTC m=+1025.348485513 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "fernet-keys" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.458534 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone: secret "keystone" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.459137 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:13.459108583 +0000 UTC m=+1025.348739220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "credential-keys" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.458713 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-scripts: secret "keystone-scripts" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.459255 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:13.459208226 +0000 UTC m=+1025.348838823 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone-scripts" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.458727 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/keystone-config-data: secret "keystone-config-data" not found Oct 01 13:54:12 crc kubenswrapper[4774]: E1001 13:54:12.459317 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data podName:116f64b7-2ad7-45ed-b587-b713cfeb3e58 nodeName:}" failed. No retries permitted until 2025-10-01 13:54:13.459305758 +0000 UTC m=+1025.348936355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data") pod "keystone-797b484697-p47d7" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58") : secret "keystone-config-data" not found Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.775345 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" podUID="116f64b7-2ad7-45ed-b587-b713cfeb3e58" containerName="keystone-api" containerID="cri-o://2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d" gracePeriod=30 Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.777617 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone1594-account-delete-m95dp" Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.847300 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone1594-account-delete-m95dp"] Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.855887 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone1594-account-delete-m95dp"] Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.892395 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f35a62-5183-4425-b4ff-cb07e202032c" path="/var/lib/kubelet/pods/47f35a62-5183-4425-b4ff-cb07e202032c/volumes" Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.893883 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72d2d4c-889c-427a-b99f-f9607ceb47e3" path="/var/lib/kubelet/pods/a72d2d4c-889c-427a-b99f-f9607ceb47e3/volumes" Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.894850 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9800108-6f64-4584-8d84-3cf5c548772b" path="/var/lib/kubelet/pods/a9800108-6f64-4584-8d84-3cf5c548772b/volumes" Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.895720 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbce09d0-de6d-4b0e-9589-3427e3fd79f9" path="/var/lib/kubelet/pods/dbce09d0-de6d-4b0e-9589-3427e3fd79f9/volumes" Oct 01 13:54:12 crc kubenswrapper[4774]: I1001 13:54:12.897776 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54aefd2-b811-4ffe-944e-7fe562e4fcdd" path="/var/lib/kubelet/pods/f54aefd2-b811-4ffe-944e-7fe562e4fcdd/volumes" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.291977 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.473028 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd9ch\" (UniqueName: \"kubernetes.io/projected/116f64b7-2ad7-45ed-b587-b713cfeb3e58-kube-api-access-fd9ch\") pod \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.473100 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data\") pod \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.473196 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys\") pod \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.473268 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts\") pod \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.473303 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys\") pod \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\" (UID: \"116f64b7-2ad7-45ed-b587-b713cfeb3e58\") " Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.479677 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/116f64b7-2ad7-45ed-b587-b713cfeb3e58-kube-api-access-fd9ch" (OuterVolumeSpecName: "kube-api-access-fd9ch") pod "116f64b7-2ad7-45ed-b587-b713cfeb3e58" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58"). InnerVolumeSpecName "kube-api-access-fd9ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.479772 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "116f64b7-2ad7-45ed-b587-b713cfeb3e58" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.480667 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts" (OuterVolumeSpecName: "scripts") pod "116f64b7-2ad7-45ed-b587-b713cfeb3e58" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.494264 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "116f64b7-2ad7-45ed-b587-b713cfeb3e58" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.502933 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data" (OuterVolumeSpecName: "config-data") pod "116f64b7-2ad7-45ed-b587-b713cfeb3e58" (UID: "116f64b7-2ad7-45ed-b587-b713cfeb3e58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.574851 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.574887 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.574899 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.574912 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd9ch\" (UniqueName: \"kubernetes.io/projected/116f64b7-2ad7-45ed-b587-b713cfeb3e58-kube-api-access-fd9ch\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.574923 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116f64b7-2ad7-45ed-b587-b713cfeb3e58-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.785160 4774 generic.go:334] "Generic (PLEG): container finished" podID="116f64b7-2ad7-45ed-b587-b713cfeb3e58" containerID="2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d" exitCode=0 Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.785226 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" event={"ID":"116f64b7-2ad7-45ed-b587-b713cfeb3e58","Type":"ContainerDied","Data":"2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d"} Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 
13:54:13.785274 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.785299 4774 scope.go:117] "RemoveContainer" containerID="2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.785265 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-797b484697-p47d7" event={"ID":"116f64b7-2ad7-45ed-b587-b713cfeb3e58","Type":"ContainerDied","Data":"cf879eb53a1ee44e91664e02dcd22d9dbf886e55c063ea81fd9dfbc3e033ec63"} Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.814416 4774 scope.go:117] "RemoveContainer" containerID="2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d" Oct 01 13:54:13 crc kubenswrapper[4774]: E1001 13:54:13.814824 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d\": container with ID starting with 2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d not found: ID does not exist" containerID="2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.814864 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d"} err="failed to get container status \"2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d\": rpc error: code = NotFound desc = could not find container \"2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d\": container with ID starting with 2b687e7f67f2d94efaa6625ee377d3c8344b0728899421dccc780b3e0de10e9d not found: ID does not exist" Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.827971 4774 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["keystone-kuttl-tests/keystone-797b484697-p47d7"] Oct 01 13:54:13 crc kubenswrapper[4774]: I1001 13:54:13.836402 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-797b484697-p47d7"] Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.556112 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.589642 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhb4b\" (UniqueName: \"kubernetes.io/projected/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-kube-api-access-nhb4b\") pod \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.589763 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-config-data\") pod \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.589793 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-scripts\") pod \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.589839 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-credential-keys\") pod \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.589878 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-fernet-keys\") pod \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\" (UID: \"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f\") " Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.601358 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-kube-api-access-nhb4b" (OuterVolumeSpecName: "kube-api-access-nhb4b") pod "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" (UID: "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f"). InnerVolumeSpecName "kube-api-access-nhb4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.601390 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" (UID: "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.601448 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" (UID: "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.601476 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-scripts" (OuterVolumeSpecName: "scripts") pod "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" (UID: "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.630348 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-config-data" (OuterVolumeSpecName: "config-data") pod "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" (UID: "b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.690888 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.690917 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.690927 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhb4b\" (UniqueName: \"kubernetes.io/projected/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-kube-api-access-nhb4b\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.690938 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.690946 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.803600 4774 generic.go:334] "Generic (PLEG): container finished" podID="b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" 
containerID="aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c" exitCode=0 Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.803681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" event={"ID":"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f","Type":"ContainerDied","Data":"aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c"} Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.803722 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" event={"ID":"b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f","Type":"ContainerDied","Data":"63d1d449037988ceed55ccf4279d67eaa2c9a3a4ea522d366bf57d392ea998fb"} Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.803749 4774 scope.go:117] "RemoveContainer" containerID="aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.803884 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-5679445c6f-vgdd2" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.856441 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-5679445c6f-vgdd2"] Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.858080 4774 scope.go:117] "RemoveContainer" containerID="aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c" Oct 01 13:54:14 crc kubenswrapper[4774]: E1001 13:54:14.858768 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c\": container with ID starting with aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c not found: ID does not exist" containerID="aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.858817 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c"} err="failed to get container status \"aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c\": rpc error: code = NotFound desc = could not find container \"aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c\": container with ID starting with aa1c7adae8ba3b3ffb0f3f24d2889f3076de1f136c2356834f898862c947eb4c not found: ID does not exist" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.866979 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-5679445c6f-vgdd2"] Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.900330 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116f64b7-2ad7-45ed-b587-b713cfeb3e58" path="/var/lib/kubelet/pods/116f64b7-2ad7-45ed-b587-b713cfeb3e58/volumes" Oct 01 13:54:14 crc kubenswrapper[4774]: I1001 13:54:14.901382 4774 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" path="/var/lib/kubelet/pods/b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f/volumes" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.289598 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-xqglv"] Oct 01 13:54:15 crc kubenswrapper[4774]: E1001 13:54:15.289840 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" containerName="keystone-api" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.289852 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" containerName="keystone-api" Oct 01 13:54:15 crc kubenswrapper[4774]: E1001 13:54:15.289869 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116f64b7-2ad7-45ed-b587-b713cfeb3e58" containerName="keystone-api" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.289876 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="116f64b7-2ad7-45ed-b587-b713cfeb3e58" containerName="keystone-api" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.290025 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="116f64b7-2ad7-45ed-b587-b713cfeb3e58" containerName="keystone-api" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.290037 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b97dba20-b3f7-4e0e-99d9-21d5e2a6e42f" containerName="keystone-api" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.290431 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.301790 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-xqglv"] Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.303381 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n87cf\" (UniqueName: \"kubernetes.io/projected/c413673c-89ab-4a88-96f8-85f8e76341b2-kube-api-access-n87cf\") pod \"keystone-db-create-xqglv\" (UID: \"c413673c-89ab-4a88-96f8-85f8e76341b2\") " pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.406952 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n87cf\" (UniqueName: \"kubernetes.io/projected/c413673c-89ab-4a88-96f8-85f8e76341b2-kube-api-access-n87cf\") pod \"keystone-db-create-xqglv\" (UID: \"c413673c-89ab-4a88-96f8-85f8e76341b2\") " pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.434236 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n87cf\" (UniqueName: \"kubernetes.io/projected/c413673c-89ab-4a88-96f8-85f8e76341b2-kube-api-access-n87cf\") pod \"keystone-db-create-xqglv\" (UID: \"c413673c-89ab-4a88-96f8-85f8e76341b2\") " pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.609286 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:15 crc kubenswrapper[4774]: I1001 13:54:15.891772 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-xqglv"] Oct 01 13:54:16 crc kubenswrapper[4774]: I1001 13:54:16.822722 4774 generic.go:334] "Generic (PLEG): container finished" podID="c413673c-89ab-4a88-96f8-85f8e76341b2" containerID="a8ec439546cd6808b3a400a08dcc184edb5034a7d18a9540a1acf3799c6cb7df" exitCode=0 Oct 01 13:54:16 crc kubenswrapper[4774]: I1001 13:54:16.822847 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-xqglv" event={"ID":"c413673c-89ab-4a88-96f8-85f8e76341b2","Type":"ContainerDied","Data":"a8ec439546cd6808b3a400a08dcc184edb5034a7d18a9540a1acf3799c6cb7df"} Oct 01 13:54:16 crc kubenswrapper[4774]: I1001 13:54:16.823053 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-xqglv" event={"ID":"c413673c-89ab-4a88-96f8-85f8e76341b2","Type":"ContainerStarted","Data":"df82fc27077aa0da5bc1d9a56608a04dd88962d5f36590ac9e7a1e865a315ae8"} Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.181851 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.352815 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n87cf\" (UniqueName: \"kubernetes.io/projected/c413673c-89ab-4a88-96f8-85f8e76341b2-kube-api-access-n87cf\") pod \"c413673c-89ab-4a88-96f8-85f8e76341b2\" (UID: \"c413673c-89ab-4a88-96f8-85f8e76341b2\") " Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.358882 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c413673c-89ab-4a88-96f8-85f8e76341b2-kube-api-access-n87cf" (OuterVolumeSpecName: "kube-api-access-n87cf") pod "c413673c-89ab-4a88-96f8-85f8e76341b2" (UID: "c413673c-89ab-4a88-96f8-85f8e76341b2"). InnerVolumeSpecName "kube-api-access-n87cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.454754 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n87cf\" (UniqueName: \"kubernetes.io/projected/c413673c-89ab-4a88-96f8-85f8e76341b2-kube-api-access-n87cf\") on node \"crc\" DevicePath \"\"" Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.842740 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-xqglv" event={"ID":"c413673c-89ab-4a88-96f8-85f8e76341b2","Type":"ContainerDied","Data":"df82fc27077aa0da5bc1d9a56608a04dd88962d5f36590ac9e7a1e865a315ae8"} Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.842820 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df82fc27077aa0da5bc1d9a56608a04dd88962d5f36590ac9e7a1e865a315ae8" Oct 01 13:54:18 crc kubenswrapper[4774]: I1001 13:54:18.843229 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-xqglv" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.168616 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7b57-account-create-5stbk"] Oct 01 13:54:26 crc kubenswrapper[4774]: E1001 13:54:26.169335 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c413673c-89ab-4a88-96f8-85f8e76341b2" containerName="mariadb-database-create" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.169347 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="c413673c-89ab-4a88-96f8-85f8e76341b2" containerName="mariadb-database-create" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.169497 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="c413673c-89ab-4a88-96f8-85f8e76341b2" containerName="mariadb-database-create" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.169868 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.173220 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.176640 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7b57-account-create-5stbk"] Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.273528 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw2zb\" (UniqueName: \"kubernetes.io/projected/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b-kube-api-access-zw2zb\") pod \"keystone-7b57-account-create-5stbk\" (UID: \"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b\") " pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk" Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.375396 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zw2zb\" (UniqueName: \"kubernetes.io/projected/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b-kube-api-access-zw2zb\") pod \"keystone-7b57-account-create-5stbk\" (UID: \"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b\") " pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk"
Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.394340 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw2zb\" (UniqueName: \"kubernetes.io/projected/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b-kube-api-access-zw2zb\") pod \"keystone-7b57-account-create-5stbk\" (UID: \"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b\") " pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk"
Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.484674 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk"
Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.773546 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7b57-account-create-5stbk"]
Oct 01 13:54:26 crc kubenswrapper[4774]: I1001 13:54:26.906768 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk" event={"ID":"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b","Type":"ContainerStarted","Data":"b198a9f307ed854305dd8d4a867308f3b2339eb5c1fde3f5a5f865a090f93f80"}
Oct 01 13:54:27 crc kubenswrapper[4774]: I1001 13:54:27.920623 4774 generic.go:334] "Generic (PLEG): container finished" podID="f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b" containerID="53ab217bd33ad948ae0545c8996270e8906b461cca5799d8511fb886d0e8bf04" exitCode=0
Oct 01 13:54:27 crc kubenswrapper[4774]: I1001 13:54:27.920737 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk" event={"ID":"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b","Type":"ContainerDied","Data":"53ab217bd33ad948ae0545c8996270e8906b461cca5799d8511fb886d0e8bf04"}
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.282714 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk"
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.434050 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw2zb\" (UniqueName: \"kubernetes.io/projected/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b-kube-api-access-zw2zb\") pod \"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b\" (UID: \"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b\") "
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.445779 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b-kube-api-access-zw2zb" (OuterVolumeSpecName: "kube-api-access-zw2zb") pod "f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b" (UID: "f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b"). InnerVolumeSpecName "kube-api-access-zw2zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.535898 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw2zb\" (UniqueName: \"kubernetes.io/projected/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b-kube-api-access-zw2zb\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.938868 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk" event={"ID":"f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b","Type":"ContainerDied","Data":"b198a9f307ed854305dd8d4a867308f3b2339eb5c1fde3f5a5f865a090f93f80"}
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.938913 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b198a9f307ed854305dd8d4a867308f3b2339eb5c1fde3f5a5f865a090f93f80"
Oct 01 13:54:29 crc kubenswrapper[4774]: I1001 13:54:29.938991 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7b57-account-create-5stbk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.759401 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-kmzvk"]
Oct 01 13:54:31 crc kubenswrapper[4774]: E1001 13:54:31.759932 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b" containerName="mariadb-account-create"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.759964 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b" containerName="mariadb-account-create"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.760248 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b" containerName="mariadb-account-create"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.761206 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.764448 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.764448 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.764506 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.766938 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-wqctq"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.775025 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-kmzvk"]
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.868543 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-977ns\" (UniqueName: \"kubernetes.io/projected/b74b3431-ffa9-4355-ae27-8a61274228e1-kube-api-access-977ns\") pod \"keystone-db-sync-kmzvk\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") " pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.868622 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74b3431-ffa9-4355-ae27-8a61274228e1-config-data\") pod \"keystone-db-sync-kmzvk\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") " pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.970320 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74b3431-ffa9-4355-ae27-8a61274228e1-config-data\") pod \"keystone-db-sync-kmzvk\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") " pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.970800 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-977ns\" (UniqueName: \"kubernetes.io/projected/b74b3431-ffa9-4355-ae27-8a61274228e1-kube-api-access-977ns\") pod \"keystone-db-sync-kmzvk\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") " pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.976789 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74b3431-ffa9-4355-ae27-8a61274228e1-config-data\") pod \"keystone-db-sync-kmzvk\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") " pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:31 crc kubenswrapper[4774]: I1001 13:54:31.994123 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-977ns\" (UniqueName: \"kubernetes.io/projected/b74b3431-ffa9-4355-ae27-8a61274228e1-kube-api-access-977ns\") pod \"keystone-db-sync-kmzvk\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") " pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:32 crc kubenswrapper[4774]: I1001 13:54:32.084691 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:32 crc kubenswrapper[4774]: I1001 13:54:32.360540 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-kmzvk"]
Oct 01 13:54:32 crc kubenswrapper[4774]: I1001 13:54:32.965667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk" event={"ID":"b74b3431-ffa9-4355-ae27-8a61274228e1","Type":"ContainerStarted","Data":"74a2ad2e5a34884c71e59f754d976f350d43dc2193a5a920302172f89b216d02"}
Oct 01 13:54:32 crc kubenswrapper[4774]: I1001 13:54:32.966059 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk" event={"ID":"b74b3431-ffa9-4355-ae27-8a61274228e1","Type":"ContainerStarted","Data":"3c710da4062a668c0c1716093ff043943c53a8d3eb06764ce110c8053696a3c6"}
Oct 01 13:54:33 crc kubenswrapper[4774]: I1001 13:54:33.977635 4774 generic.go:334] "Generic (PLEG): container finished" podID="b74b3431-ffa9-4355-ae27-8a61274228e1" containerID="74a2ad2e5a34884c71e59f754d976f350d43dc2193a5a920302172f89b216d02" exitCode=0
Oct 01 13:54:33 crc kubenswrapper[4774]: I1001 13:54:33.977709 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk" event={"ID":"b74b3431-ffa9-4355-ae27-8a61274228e1","Type":"ContainerDied","Data":"74a2ad2e5a34884c71e59f754d976f350d43dc2193a5a920302172f89b216d02"}
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.404392 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.527624 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74b3431-ffa9-4355-ae27-8a61274228e1-config-data\") pod \"b74b3431-ffa9-4355-ae27-8a61274228e1\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") "
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.527812 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-977ns\" (UniqueName: \"kubernetes.io/projected/b74b3431-ffa9-4355-ae27-8a61274228e1-kube-api-access-977ns\") pod \"b74b3431-ffa9-4355-ae27-8a61274228e1\" (UID: \"b74b3431-ffa9-4355-ae27-8a61274228e1\") "
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.535847 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74b3431-ffa9-4355-ae27-8a61274228e1-kube-api-access-977ns" (OuterVolumeSpecName: "kube-api-access-977ns") pod "b74b3431-ffa9-4355-ae27-8a61274228e1" (UID: "b74b3431-ffa9-4355-ae27-8a61274228e1"). InnerVolumeSpecName "kube-api-access-977ns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.583774 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b74b3431-ffa9-4355-ae27-8a61274228e1-config-data" (OuterVolumeSpecName: "config-data") pod "b74b3431-ffa9-4355-ae27-8a61274228e1" (UID: "b74b3431-ffa9-4355-ae27-8a61274228e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.630346 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b74b3431-ffa9-4355-ae27-8a61274228e1-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.630402 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-977ns\" (UniqueName: \"kubernetes.io/projected/b74b3431-ffa9-4355-ae27-8a61274228e1-kube-api-access-977ns\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.998901 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk" event={"ID":"b74b3431-ffa9-4355-ae27-8a61274228e1","Type":"ContainerDied","Data":"3c710da4062a668c0c1716093ff043943c53a8d3eb06764ce110c8053696a3c6"}
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.998954 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c710da4062a668c0c1716093ff043943c53a8d3eb06764ce110c8053696a3c6"
Oct 01 13:54:35 crc kubenswrapper[4774]: I1001 13:54:35.998997 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-kmzvk"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.194406 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-ktv6g"]
Oct 01 13:54:36 crc kubenswrapper[4774]: E1001 13:54:36.194833 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b3431-ffa9-4355-ae27-8a61274228e1" containerName="keystone-db-sync"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.194855 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b3431-ffa9-4355-ae27-8a61274228e1" containerName="keystone-db-sync"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.195154 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74b3431-ffa9-4355-ae27-8a61274228e1" containerName="keystone-db-sync"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.195754 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.198433 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.198746 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.199167 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-wqctq"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.199488 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.211259 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-ktv6g"]
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.352083 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-scripts\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.352228 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-fernet-keys\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.352263 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-config-data\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.352301 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s6vf\" (UniqueName: \"kubernetes.io/projected/929ef946-4dd4-4648-9133-026bd90b9c8c-kube-api-access-2s6vf\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.352437 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-credential-keys\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.454192 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-fernet-keys\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.454263 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-config-data\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.454318 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s6vf\" (UniqueName: \"kubernetes.io/projected/929ef946-4dd4-4648-9133-026bd90b9c8c-kube-api-access-2s6vf\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.454365 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-credential-keys\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.454441 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-scripts\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.464919 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-scripts\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.465366 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-credential-keys\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.465960 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-fernet-keys\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.468024 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-config-data\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.481906 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s6vf\" (UniqueName: \"kubernetes.io/projected/929ef946-4dd4-4648-9133-026bd90b9c8c-kube-api-access-2s6vf\") pod \"keystone-bootstrap-ktv6g\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") " pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:36 crc kubenswrapper[4774]: I1001 13:54:36.519139 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:37 crc kubenswrapper[4774]: I1001 13:54:37.072632 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-ktv6g"]
Oct 01 13:54:37 crc kubenswrapper[4774]: W1001 13:54:37.082660 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod929ef946_4dd4_4648_9133_026bd90b9c8c.slice/crio-de5154cb1a661d6f036bc9ca1c01015a2b055107c6e882b442a0a14fc35b4086 WatchSource:0}: Error finding container de5154cb1a661d6f036bc9ca1c01015a2b055107c6e882b442a0a14fc35b4086: Status 404 returned error can't find the container with id de5154cb1a661d6f036bc9ca1c01015a2b055107c6e882b442a0a14fc35b4086
Oct 01 13:54:38 crc kubenswrapper[4774]: I1001 13:54:38.020964 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g" event={"ID":"929ef946-4dd4-4648-9133-026bd90b9c8c","Type":"ContainerStarted","Data":"5bd4dac455924a756e50e16687adb78935bb529ea0340011064951b3f37d5fe1"}
Oct 01 13:54:38 crc kubenswrapper[4774]: I1001 13:54:38.021329 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g" event={"ID":"929ef946-4dd4-4648-9133-026bd90b9c8c","Type":"ContainerStarted","Data":"de5154cb1a661d6f036bc9ca1c01015a2b055107c6e882b442a0a14fc35b4086"}
Oct 01 13:54:38 crc kubenswrapper[4774]: I1001 13:54:38.049537 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g" podStartSLOduration=2.049512008 podStartE2EDuration="2.049512008s" podCreationTimestamp="2025-10-01 13:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:54:38.044170808 +0000 UTC m=+1049.933801435" watchObservedRunningTime="2025-10-01 13:54:38.049512008 +0000 UTC m=+1049.939142635"
Oct 01 13:54:40 crc kubenswrapper[4774]: I1001 13:54:40.039032 4774 generic.go:334] "Generic (PLEG): container finished" podID="929ef946-4dd4-4648-9133-026bd90b9c8c" containerID="5bd4dac455924a756e50e16687adb78935bb529ea0340011064951b3f37d5fe1" exitCode=0
Oct 01 13:54:40 crc kubenswrapper[4774]: I1001 13:54:40.039146 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g" event={"ID":"929ef946-4dd4-4648-9133-026bd90b9c8c","Type":"ContainerDied","Data":"5bd4dac455924a756e50e16687adb78935bb529ea0340011064951b3f37d5fe1"}
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.333918 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.433009 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-fernet-keys\") pod \"929ef946-4dd4-4648-9133-026bd90b9c8c\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") "
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.433076 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s6vf\" (UniqueName: \"kubernetes.io/projected/929ef946-4dd4-4648-9133-026bd90b9c8c-kube-api-access-2s6vf\") pod \"929ef946-4dd4-4648-9133-026bd90b9c8c\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") "
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.433129 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-scripts\") pod \"929ef946-4dd4-4648-9133-026bd90b9c8c\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") "
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.433171 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-credential-keys\") pod \"929ef946-4dd4-4648-9133-026bd90b9c8c\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") "
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.433225 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-config-data\") pod \"929ef946-4dd4-4648-9133-026bd90b9c8c\" (UID: \"929ef946-4dd4-4648-9133-026bd90b9c8c\") "
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.438627 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-scripts" (OuterVolumeSpecName: "scripts") pod "929ef946-4dd4-4648-9133-026bd90b9c8c" (UID: "929ef946-4dd4-4648-9133-026bd90b9c8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.438625 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/929ef946-4dd4-4648-9133-026bd90b9c8c-kube-api-access-2s6vf" (OuterVolumeSpecName: "kube-api-access-2s6vf") pod "929ef946-4dd4-4648-9133-026bd90b9c8c" (UID: "929ef946-4dd4-4648-9133-026bd90b9c8c"). InnerVolumeSpecName "kube-api-access-2s6vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.438712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "929ef946-4dd4-4648-9133-026bd90b9c8c" (UID: "929ef946-4dd4-4648-9133-026bd90b9c8c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.440140 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "929ef946-4dd4-4648-9133-026bd90b9c8c" (UID: "929ef946-4dd4-4648-9133-026bd90b9c8c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.453119 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-config-data" (OuterVolumeSpecName: "config-data") pod "929ef946-4dd4-4648-9133-026bd90b9c8c" (UID: "929ef946-4dd4-4648-9133-026bd90b9c8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.535231 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.535295 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.535314 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s6vf\" (UniqueName: \"kubernetes.io/projected/929ef946-4dd4-4648-9133-026bd90b9c8c-kube-api-access-2s6vf\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.535333 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:41 crc kubenswrapper[4774]: I1001 13:54:41.535351 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/929ef946-4dd4-4648-9133-026bd90b9c8c-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 01 13:54:42 crc kubenswrapper[4774]: I1001 13:54:42.059985 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="3a7a736c4f823eab2910dec0cccbca7dff735f259bf49121d23ef355416c9e32" exitCode=1
Oct 01 13:54:42 crc kubenswrapper[4774]: I1001 13:54:42.060061 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"3a7a736c4f823eab2910dec0cccbca7dff735f259bf49121d23ef355416c9e32"}
Oct 01 13:54:42 crc kubenswrapper[4774]: I1001 13:54:42.060519 4774 scope.go:117] "RemoveContainer" containerID="3a7a736c4f823eab2910dec0cccbca7dff735f259bf49121d23ef355416c9e32"
Oct 01 13:54:42 crc kubenswrapper[4774]: I1001 13:54:42.065978 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g" event={"ID":"929ef946-4dd4-4648-9133-026bd90b9c8c","Type":"ContainerDied","Data":"de5154cb1a661d6f036bc9ca1c01015a2b055107c6e882b442a0a14fc35b4086"}
Oct 01 13:54:42 crc kubenswrapper[4774]: I1001 13:54:42.066042 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5154cb1a661d6f036bc9ca1c01015a2b055107c6e882b442a0a14fc35b4086"
Oct 01 13:54:42 crc kubenswrapper[4774]: I1001 13:54:42.066107 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-ktv6g"
Oct 01 13:54:43 crc kubenswrapper[4774]: I1001 13:54:43.077033 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536"}
Oct 01 13:54:43 crc kubenswrapper[4774]: I1001 13:54:43.077962 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 13:54:48 crc kubenswrapper[4774]: I1001 13:54:48.300978 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.471781 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"]
Oct 01 13:54:59 crc kubenswrapper[4774]: E1001 13:54:59.472535 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="929ef946-4dd4-4648-9133-026bd90b9c8c" containerName="keystone-bootstrap"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.472548 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="929ef946-4dd4-4648-9133-026bd90b9c8c" containerName="keystone-bootstrap"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.472659 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="929ef946-4dd4-4648-9133-026bd90b9c8c" containerName="keystone-bootstrap"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.473070 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.474728 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.478178 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.478184 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-wqctq"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.478279 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.484216 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"]
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.601722 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-scripts\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.601810 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-config-data\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.601858 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dndks\" (UniqueName: \"kubernetes.io/projected/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-kube-api-access-dndks\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.601894 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-credential-keys\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.602011 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-fernet-keys\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.702904 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-scripts\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.702997 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-config-data\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.703026 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dndks\" (UniqueName: \"kubernetes.io/projected/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-kube-api-access-dndks\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.703049 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-credential-keys\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.703086 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-fernet-keys\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.709249 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-config-data\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.709525 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-fernet-keys\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.709658 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-scripts\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.709790 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-credential-keys\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.725844 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dndks\" (UniqueName: \"kubernetes.io/projected/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-kube-api-access-dndks\") pod \"keystone-6c94bdfb5f-gmp77\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"
Oct 01 13:54:59 crc kubenswrapper[4774]: I1001 13:54:59.789723 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" Oct 01 13:55:00 crc kubenswrapper[4774]: I1001 13:55:00.311170 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"] Oct 01 13:55:01 crc kubenswrapper[4774]: I1001 13:55:01.231264 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" event={"ID":"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c","Type":"ContainerStarted","Data":"098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838"} Oct 01 13:55:01 crc kubenswrapper[4774]: I1001 13:55:01.231735 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" Oct 01 13:55:01 crc kubenswrapper[4774]: I1001 13:55:01.231759 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" event={"ID":"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c","Type":"ContainerStarted","Data":"47227ad2d391ccf5ce1c1d2dba02a85a33d486a93f96aac848c5ce92c9811444"} Oct 01 13:55:01 crc kubenswrapper[4774]: I1001 13:55:01.261271 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" podStartSLOduration=2.261245323 podStartE2EDuration="2.261245323s" podCreationTimestamp="2025-10-01 13:54:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:55:01.256631953 +0000 UTC m=+1073.146262610" watchObservedRunningTime="2025-10-01 13:55:01.261245323 +0000 UTC m=+1073.150875960" Oct 01 13:55:07 crc kubenswrapper[4774]: I1001 13:55:07.270720 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 01 13:55:07 crc kubenswrapper[4774]: I1001 13:55:07.271422 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:55:31 crc kubenswrapper[4774]: I1001 13:55:31.152402 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" Oct 01 13:55:37 crc kubenswrapper[4774]: I1001 13:55:37.271364 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:55:37 crc kubenswrapper[4774]: I1001 13:55:37.272108 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.773636 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-ktv6g"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.784529 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-ktv6g"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.789212 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-kmzvk"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.796958 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.797147 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" podUID="4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" containerName="keystone-api" containerID="cri-o://098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838" gracePeriod=30 Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.805492 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-kmzvk"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.846148 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone7b57-account-delete-d4ntr"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.847115 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.854895 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone7b57-account-delete-d4ntr"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.860863 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-xqglv"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.878217 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-xqglv"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.885524 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone7b57-account-delete-d4ntr"] Oct 01 13:55:47 crc kubenswrapper[4774]: E1001 13:55:47.885962 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-8dxnj], unattached volumes=[], failed to process volumes=[kube-api-access-8dxnj]: context canceled" pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" 
podUID="68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9" Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.899766 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7b57-account-create-5stbk"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.903557 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7b57-account-create-5stbk"] Oct 01 13:55:47 crc kubenswrapper[4774]: I1001 13:55:47.977485 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxnj\" (UniqueName: \"kubernetes.io/projected/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9-kube-api-access-8dxnj\") pod \"keystone7b57-account-delete-d4ntr\" (UID: \"68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9\") " pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.079349 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxnj\" (UniqueName: \"kubernetes.io/projected/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9-kube-api-access-8dxnj\") pod \"keystone7b57-account-delete-d4ntr\" (UID: \"68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9\") " pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.129858 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxnj\" (UniqueName: \"kubernetes.io/projected/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9-kube-api-access-8dxnj\") pod \"keystone7b57-account-delete-d4ntr\" (UID: \"68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9\") " pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.639343 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.653919 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.790148 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dxnj\" (UniqueName: \"kubernetes.io/projected/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9-kube-api-access-8dxnj\") pod \"68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9\" (UID: \"68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9\") " Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.795765 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9-kube-api-access-8dxnj" (OuterVolumeSpecName: "kube-api-access-8dxnj") pod "68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9" (UID: "68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9"). InnerVolumeSpecName "kube-api-access-8dxnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.884871 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="929ef946-4dd4-4648-9133-026bd90b9c8c" path="/var/lib/kubelet/pods/929ef946-4dd4-4648-9133-026bd90b9c8c/volumes" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.885918 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74b3431-ffa9-4355-ae27-8a61274228e1" path="/var/lib/kubelet/pods/b74b3431-ffa9-4355-ae27-8a61274228e1/volumes" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.886855 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c413673c-89ab-4a88-96f8-85f8e76341b2" path="/var/lib/kubelet/pods/c413673c-89ab-4a88-96f8-85f8e76341b2/volumes" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.887764 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b" path="/var/lib/kubelet/pods/f91f3ee4-7d9a-47d5-b08a-ed8b32dd568b/volumes" Oct 01 13:55:48 crc kubenswrapper[4774]: I1001 13:55:48.892485 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dxnj\" (UniqueName: \"kubernetes.io/projected/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9-kube-api-access-8dxnj\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:49 crc kubenswrapper[4774]: I1001 13:55:49.649504 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7b57-account-delete-d4ntr" Oct 01 13:55:49 crc kubenswrapper[4774]: I1001 13:55:49.699243 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone7b57-account-delete-d4ntr"] Oct 01 13:55:49 crc kubenswrapper[4774]: I1001 13:55:49.706315 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone7b57-account-delete-d4ntr"] Oct 01 13:55:50 crc kubenswrapper[4774]: I1001 13:55:50.879479 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9" path="/var/lib/kubelet/pods/68c9cc2f-dc1b-4539-90ff-1f7bd8031ea9/volumes" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.334209 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.437322 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-scripts\") pod \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.437627 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-config-data\") pod \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.437685 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dndks\" (UniqueName: \"kubernetes.io/projected/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-kube-api-access-dndks\") pod \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 
13:55:51.437735 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-credential-keys\") pod \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.437817 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-fernet-keys\") pod \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\" (UID: \"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c\") " Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.445052 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-scripts" (OuterVolumeSpecName: "scripts") pod "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" (UID: "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.445432 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" (UID: "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.446682 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-kube-api-access-dndks" (OuterVolumeSpecName: "kube-api-access-dndks") pod "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" (UID: "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c"). InnerVolumeSpecName "kube-api-access-dndks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.456670 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" (UID: "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.473565 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-config-data" (OuterVolumeSpecName: "config-data") pod "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" (UID: "4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.539712 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.539767 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.539785 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.539802 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dndks\" (UniqueName: \"kubernetes.io/projected/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-kube-api-access-dndks\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:51 crc 
kubenswrapper[4774]: I1001 13:55:51.539819 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.674174 4774 generic.go:334] "Generic (PLEG): container finished" podID="4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" containerID="098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838" exitCode=0 Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.674278 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" event={"ID":"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c","Type":"ContainerDied","Data":"098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838"} Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.674324 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.674370 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77" event={"ID":"4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c","Type":"ContainerDied","Data":"47227ad2d391ccf5ce1c1d2dba02a85a33d486a93f96aac848c5ce92c9811444"} Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.674518 4774 scope.go:117] "RemoveContainer" containerID="098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.718282 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"] Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.727000 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-6c94bdfb5f-gmp77"] Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.728585 4774 scope.go:117] "RemoveContainer" 
containerID="098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838" Oct 01 13:55:51 crc kubenswrapper[4774]: E1001 13:55:51.729183 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838\": container with ID starting with 098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838 not found: ID does not exist" containerID="098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838" Oct 01 13:55:51 crc kubenswrapper[4774]: I1001 13:55:51.729379 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838"} err="failed to get container status \"098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838\": rpc error: code = NotFound desc = could not find container \"098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838\": container with ID starting with 098e22ae96aecb5c90323d92c5302fcbaacca1601ed78dce691ff5d70242c838 not found: ID does not exist" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.144829 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6pzll"] Oct 01 13:55:52 crc kubenswrapper[4774]: E1001 13:55:52.145731 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" containerName="keystone-api" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.145746 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" containerName="keystone-api" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.145853 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" containerName="keystone-api" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.146746 4774 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.151714 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6pzll"] Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.254491 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkm8s\" (UniqueName: \"kubernetes.io/projected/f3e9e066-29cc-4cde-aa4f-30e95341ff25-kube-api-access-qkm8s\") pod \"keystone-db-create-6pzll\" (UID: \"f3e9e066-29cc-4cde-aa4f-30e95341ff25\") " pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.356301 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkm8s\" (UniqueName: \"kubernetes.io/projected/f3e9e066-29cc-4cde-aa4f-30e95341ff25-kube-api-access-qkm8s\") pod \"keystone-db-create-6pzll\" (UID: \"f3e9e066-29cc-4cde-aa4f-30e95341ff25\") " pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.378690 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkm8s\" (UniqueName: \"kubernetes.io/projected/f3e9e066-29cc-4cde-aa4f-30e95341ff25-kube-api-access-qkm8s\") pod \"keystone-db-create-6pzll\" (UID: \"f3e9e066-29cc-4cde-aa4f-30e95341ff25\") " pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.464403 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.882909 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c" path="/var/lib/kubelet/pods/4ea11b0f-0e8b-438d-84ec-e08ccf9bf88c/volumes" Oct 01 13:55:52 crc kubenswrapper[4774]: I1001 13:55:52.918633 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6pzll"] Oct 01 13:55:53 crc kubenswrapper[4774]: I1001 13:55:53.698793 4774 generic.go:334] "Generic (PLEG): container finished" podID="f3e9e066-29cc-4cde-aa4f-30e95341ff25" containerID="0c3dfde683efa4f010ac68fedec926e0d9ed8a48ae29cb0c17337512c3264761" exitCode=0 Oct 01 13:55:53 crc kubenswrapper[4774]: I1001 13:55:53.698887 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6pzll" event={"ID":"f3e9e066-29cc-4cde-aa4f-30e95341ff25","Type":"ContainerDied","Data":"0c3dfde683efa4f010ac68fedec926e0d9ed8a48ae29cb0c17337512c3264761"} Oct 01 13:55:53 crc kubenswrapper[4774]: I1001 13:55:53.699278 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6pzll" event={"ID":"f3e9e066-29cc-4cde-aa4f-30e95341ff25","Type":"ContainerStarted","Data":"b4ef472c058dfa2966cdb6b9de2fd3ab7b2f245a372128af5dfaecd5c515ac56"} Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.104956 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.209787 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkm8s\" (UniqueName: \"kubernetes.io/projected/f3e9e066-29cc-4cde-aa4f-30e95341ff25-kube-api-access-qkm8s\") pod \"f3e9e066-29cc-4cde-aa4f-30e95341ff25\" (UID: \"f3e9e066-29cc-4cde-aa4f-30e95341ff25\") " Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.217183 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e9e066-29cc-4cde-aa4f-30e95341ff25-kube-api-access-qkm8s" (OuterVolumeSpecName: "kube-api-access-qkm8s") pod "f3e9e066-29cc-4cde-aa4f-30e95341ff25" (UID: "f3e9e066-29cc-4cde-aa4f-30e95341ff25"). InnerVolumeSpecName "kube-api-access-qkm8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.311959 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkm8s\" (UniqueName: \"kubernetes.io/projected/f3e9e066-29cc-4cde-aa4f-30e95341ff25-kube-api-access-qkm8s\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.721006 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-6pzll" event={"ID":"f3e9e066-29cc-4cde-aa4f-30e95341ff25","Type":"ContainerDied","Data":"b4ef472c058dfa2966cdb6b9de2fd3ab7b2f245a372128af5dfaecd5c515ac56"} Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.721071 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ef472c058dfa2966cdb6b9de2fd3ab7b2f245a372128af5dfaecd5c515ac56" Oct 01 13:55:55 crc kubenswrapper[4774]: I1001 13:55:55.721338 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-6pzll" Oct 01 13:56:02 crc kubenswrapper[4774]: I1001 13:56:02.983337 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-7389-account-create-b9gjc"] Oct 01 13:56:02 crc kubenswrapper[4774]: E1001 13:56:02.984075 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e9e066-29cc-4cde-aa4f-30e95341ff25" containerName="mariadb-database-create" Oct 01 13:56:02 crc kubenswrapper[4774]: I1001 13:56:02.984086 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e9e066-29cc-4cde-aa4f-30e95341ff25" containerName="mariadb-database-create" Oct 01 13:56:02 crc kubenswrapper[4774]: I1001 13:56:02.984197 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e9e066-29cc-4cde-aa4f-30e95341ff25" containerName="mariadb-database-create" Oct 01 13:56:02 crc kubenswrapper[4774]: I1001 13:56:02.984617 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:02 crc kubenswrapper[4774]: I1001 13:56:02.986806 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 01 13:56:02 crc kubenswrapper[4774]: I1001 13:56:02.997418 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7389-account-create-b9gjc"] Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.070539 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zvhz\" (UniqueName: \"kubernetes.io/projected/96b10d1d-decb-40f0-8f45-74c466d1d9be-kube-api-access-9zvhz\") pod \"keystone-7389-account-create-b9gjc\" (UID: \"96b10d1d-decb-40f0-8f45-74c466d1d9be\") " pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.171481 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9zvhz\" (UniqueName: \"kubernetes.io/projected/96b10d1d-decb-40f0-8f45-74c466d1d9be-kube-api-access-9zvhz\") pod \"keystone-7389-account-create-b9gjc\" (UID: \"96b10d1d-decb-40f0-8f45-74c466d1d9be\") " pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.195215 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zvhz\" (UniqueName: \"kubernetes.io/projected/96b10d1d-decb-40f0-8f45-74c466d1d9be-kube-api-access-9zvhz\") pod \"keystone-7389-account-create-b9gjc\" (UID: \"96b10d1d-decb-40f0-8f45-74c466d1d9be\") " pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.348325 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.595372 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-7389-account-create-b9gjc"] Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.790858 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" event={"ID":"96b10d1d-decb-40f0-8f45-74c466d1d9be","Type":"ContainerStarted","Data":"6a5c9da08026208f38109311cc54b8d327d886f4db61859c0654741821c2bc2c"} Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.790902 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" event={"ID":"96b10d1d-decb-40f0-8f45-74c466d1d9be","Type":"ContainerStarted","Data":"52264acd13a02acd9aa5567402219bdf90a279c2b8097bbdd7c329b4f343de08"} Oct 01 13:56:03 crc kubenswrapper[4774]: I1001 13:56:03.808800 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" 
podStartSLOduration=1.808779191 podStartE2EDuration="1.808779191s" podCreationTimestamp="2025-10-01 13:56:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:56:03.807827475 +0000 UTC m=+1135.697458092" watchObservedRunningTime="2025-10-01 13:56:03.808779191 +0000 UTC m=+1135.698409788" Oct 01 13:56:04 crc kubenswrapper[4774]: I1001 13:56:04.799788 4774 generic.go:334] "Generic (PLEG): container finished" podID="96b10d1d-decb-40f0-8f45-74c466d1d9be" containerID="6a5c9da08026208f38109311cc54b8d327d886f4db61859c0654741821c2bc2c" exitCode=0 Oct 01 13:56:04 crc kubenswrapper[4774]: I1001 13:56:04.799838 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" event={"ID":"96b10d1d-decb-40f0-8f45-74c466d1d9be","Type":"ContainerDied","Data":"6a5c9da08026208f38109311cc54b8d327d886f4db61859c0654741821c2bc2c"} Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.095728 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.216776 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zvhz\" (UniqueName: \"kubernetes.io/projected/96b10d1d-decb-40f0-8f45-74c466d1d9be-kube-api-access-9zvhz\") pod \"96b10d1d-decb-40f0-8f45-74c466d1d9be\" (UID: \"96b10d1d-decb-40f0-8f45-74c466d1d9be\") " Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.224525 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b10d1d-decb-40f0-8f45-74c466d1d9be-kube-api-access-9zvhz" (OuterVolumeSpecName: "kube-api-access-9zvhz") pod "96b10d1d-decb-40f0-8f45-74c466d1d9be" (UID: "96b10d1d-decb-40f0-8f45-74c466d1d9be"). InnerVolumeSpecName "kube-api-access-9zvhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.318561 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zvhz\" (UniqueName: \"kubernetes.io/projected/96b10d1d-decb-40f0-8f45-74c466d1d9be-kube-api-access-9zvhz\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.816616 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" event={"ID":"96b10d1d-decb-40f0-8f45-74c466d1d9be","Type":"ContainerDied","Data":"52264acd13a02acd9aa5567402219bdf90a279c2b8097bbdd7c329b4f343de08"} Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.816661 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52264acd13a02acd9aa5567402219bdf90a279c2b8097bbdd7c329b4f343de08" Oct 01 13:56:06 crc kubenswrapper[4774]: I1001 13:56:06.816697 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-7389-account-create-b9gjc" Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.270410 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.270510 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.270567 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.271389 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e1986b6aa4082eaba4db55db350683c8c3d94491cc5ccf5f341ed6826a24126"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.271486 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://4e1986b6aa4082eaba4db55db350683c8c3d94491cc5ccf5f341ed6826a24126" gracePeriod=600 Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.827119 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="4e1986b6aa4082eaba4db55db350683c8c3d94491cc5ccf5f341ed6826a24126" exitCode=0 Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.827161 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"4e1986b6aa4082eaba4db55db350683c8c3d94491cc5ccf5f341ed6826a24126"} Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.827488 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"c15aef90c5355ee45eff7a2029dad852c4a145de45cfd5eb39c4d4e24c84668f"} Oct 01 13:56:07 crc kubenswrapper[4774]: I1001 13:56:07.827508 4774 scope.go:117] "RemoveContainer" containerID="47c8b3c7b9b44c1a4fad799db6db52fcadf5c6e425449337453c911dfbb7a1cd" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.553896 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rtw5q"] Oct 01 13:56:08 crc kubenswrapper[4774]: E1001 13:56:08.555222 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b10d1d-decb-40f0-8f45-74c466d1d9be" containerName="mariadb-account-create" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.555328 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b10d1d-decb-40f0-8f45-74c466d1d9be" containerName="mariadb-account-create" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.555568 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b10d1d-decb-40f0-8f45-74c466d1d9be" containerName="mariadb-account-create" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.556180 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.558743 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.558967 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.559135 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.559139 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.559243 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jg57g" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.575201 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rtw5q"] Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.660057 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-combined-ca-bundle\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.660347 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-config-data\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.660543 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffjvb\" (UniqueName: \"kubernetes.io/projected/11507957-af39-4768-9071-cf2579357a21-kube-api-access-ffjvb\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.764533 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-combined-ca-bundle\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.764600 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-config-data\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.764671 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffjvb\" (UniqueName: \"kubernetes.io/projected/11507957-af39-4768-9071-cf2579357a21-kube-api-access-ffjvb\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.770880 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-combined-ca-bundle\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.783590 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-config-data\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.786554 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffjvb\" (UniqueName: \"kubernetes.io/projected/11507957-af39-4768-9071-cf2579357a21-kube-api-access-ffjvb\") pod \"keystone-db-sync-rtw5q\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.873293 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jg57g" Oct 01 13:56:08 crc kubenswrapper[4774]: I1001 13:56:08.882506 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:09 crc kubenswrapper[4774]: I1001 13:56:09.151814 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rtw5q"] Oct 01 13:56:09 crc kubenswrapper[4774]: I1001 13:56:09.844044 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" event={"ID":"11507957-af39-4768-9071-cf2579357a21","Type":"ContainerStarted","Data":"28ee9994d431637ce0e99c802227d39ef9e343e7fe4a0181fc1b3fc31f7af1ae"} Oct 01 13:56:09 crc kubenswrapper[4774]: I1001 13:56:09.844430 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" event={"ID":"11507957-af39-4768-9071-cf2579357a21","Type":"ContainerStarted","Data":"156a84bfc1dfe1074de950735f611e2258062c12506e6f0dcf7a60de748918cb"} Oct 01 13:56:09 crc kubenswrapper[4774]: I1001 13:56:09.871345 4774 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" podStartSLOduration=1.871318748 podStartE2EDuration="1.871318748s" podCreationTimestamp="2025-10-01 13:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:56:09.863233416 +0000 UTC m=+1141.752864023" watchObservedRunningTime="2025-10-01 13:56:09.871318748 +0000 UTC m=+1141.760949345" Oct 01 13:56:10 crc kubenswrapper[4774]: I1001 13:56:10.850860 4774 generic.go:334] "Generic (PLEG): container finished" podID="11507957-af39-4768-9071-cf2579357a21" containerID="28ee9994d431637ce0e99c802227d39ef9e343e7fe4a0181fc1b3fc31f7af1ae" exitCode=0 Oct 01 13:56:10 crc kubenswrapper[4774]: I1001 13:56:10.850895 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" event={"ID":"11507957-af39-4768-9071-cf2579357a21","Type":"ContainerDied","Data":"28ee9994d431637ce0e99c802227d39ef9e343e7fe4a0181fc1b3fc31f7af1ae"} Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.166691 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.318506 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-combined-ca-bundle\") pod \"11507957-af39-4768-9071-cf2579357a21\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.318980 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-config-data\") pod \"11507957-af39-4768-9071-cf2579357a21\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.319071 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffjvb\" (UniqueName: \"kubernetes.io/projected/11507957-af39-4768-9071-cf2579357a21-kube-api-access-ffjvb\") pod \"11507957-af39-4768-9071-cf2579357a21\" (UID: \"11507957-af39-4768-9071-cf2579357a21\") " Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.327802 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11507957-af39-4768-9071-cf2579357a21-kube-api-access-ffjvb" (OuterVolumeSpecName: "kube-api-access-ffjvb") pod "11507957-af39-4768-9071-cf2579357a21" (UID: "11507957-af39-4768-9071-cf2579357a21"). InnerVolumeSpecName "kube-api-access-ffjvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.349687 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11507957-af39-4768-9071-cf2579357a21" (UID: "11507957-af39-4768-9071-cf2579357a21"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.372558 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-config-data" (OuterVolumeSpecName: "config-data") pod "11507957-af39-4768-9071-cf2579357a21" (UID: "11507957-af39-4768-9071-cf2579357a21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.420965 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.421001 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffjvb\" (UniqueName: \"kubernetes.io/projected/11507957-af39-4768-9071-cf2579357a21-kube-api-access-ffjvb\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.421013 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11507957-af39-4768-9071-cf2579357a21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.868692 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" event={"ID":"11507957-af39-4768-9071-cf2579357a21","Type":"ContainerDied","Data":"156a84bfc1dfe1074de950735f611e2258062c12506e6f0dcf7a60de748918cb"} Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.868766 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="156a84bfc1dfe1074de950735f611e2258062c12506e6f0dcf7a60de748918cb" Oct 01 13:56:12 crc kubenswrapper[4774]: I1001 13:56:12.868782 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-rtw5q" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.364479 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-knflt"] Oct 01 13:56:13 crc kubenswrapper[4774]: E1001 13:56:13.365080 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11507957-af39-4768-9071-cf2579357a21" containerName="keystone-db-sync" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.365097 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="11507957-af39-4768-9071-cf2579357a21" containerName="keystone-db-sync" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.365265 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="11507957-af39-4768-9071-cf2579357a21" containerName="keystone-db-sync" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.365816 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.368480 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.368505 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.368820 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.369406 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.370090 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jg57g" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.380231 4774 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-knflt"] Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.548213 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-scripts\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.548280 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-config-data\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.548322 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-credential-keys\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.548357 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-fernet-keys\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.548680 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4kzs\" (UniqueName: \"kubernetes.io/projected/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-kube-api-access-z4kzs\") pod 
\"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.548768 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-combined-ca-bundle\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.650906 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4kzs\" (UniqueName: \"kubernetes.io/projected/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-kube-api-access-z4kzs\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.650992 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-combined-ca-bundle\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.651060 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-scripts\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.651093 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-config-data\") pod 
\"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.651142 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-credential-keys\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.651194 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-fernet-keys\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.657654 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-credential-keys\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.658377 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-scripts\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.658504 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-combined-ca-bundle\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " 
pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.658527 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-config-data\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.658570 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-fernet-keys\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.670278 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4kzs\" (UniqueName: \"kubernetes.io/projected/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-kube-api-access-z4kzs\") pod \"keystone-bootstrap-knflt\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.684538 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:13 crc kubenswrapper[4774]: I1001 13:56:13.944614 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-knflt"] Oct 01 13:56:13 crc kubenswrapper[4774]: W1001 13:56:13.958155 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a983f37_57ca_459f_a6d7_aa70fe3bf2d8.slice/crio-1cdceb5f840ca7eea3efa81ff9eeccedaa61d2a8b503e693598e2b31d124638c WatchSource:0}: Error finding container 1cdceb5f840ca7eea3efa81ff9eeccedaa61d2a8b503e693598e2b31d124638c: Status 404 returned error can't find the container with id 1cdceb5f840ca7eea3efa81ff9eeccedaa61d2a8b503e693598e2b31d124638c Oct 01 13:56:14 crc kubenswrapper[4774]: I1001 13:56:14.899301 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" event={"ID":"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8","Type":"ContainerStarted","Data":"dbfd9d7aa8103cf7ec614ec7c12818d1cad4498e1e67f0dfdf653effdbf4b2b7"} Oct 01 13:56:14 crc kubenswrapper[4774]: I1001 13:56:14.899914 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" event={"ID":"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8","Type":"ContainerStarted","Data":"1cdceb5f840ca7eea3efa81ff9eeccedaa61d2a8b503e693598e2b31d124638c"} Oct 01 13:56:14 crc kubenswrapper[4774]: I1001 13:56:14.937858 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" podStartSLOduration=1.937832746 podStartE2EDuration="1.937832746s" podCreationTimestamp="2025-10-01 13:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:56:14.932744357 +0000 UTC m=+1146.822374994" watchObservedRunningTime="2025-10-01 13:56:14.937832746 +0000 UTC 
m=+1146.827463383" Oct 01 13:56:16 crc kubenswrapper[4774]: I1001 13:56:16.921111 4774 generic.go:334] "Generic (PLEG): container finished" podID="0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" containerID="dbfd9d7aa8103cf7ec614ec7c12818d1cad4498e1e67f0dfdf653effdbf4b2b7" exitCode=0 Oct 01 13:56:16 crc kubenswrapper[4774]: I1001 13:56:16.921238 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" event={"ID":"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8","Type":"ContainerDied","Data":"dbfd9d7aa8103cf7ec614ec7c12818d1cad4498e1e67f0dfdf653effdbf4b2b7"} Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.296276 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.437854 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-combined-ca-bundle\") pod \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.437919 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-fernet-keys\") pod \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.437977 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-config-data\") pod \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.438010 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-credential-keys\") pod \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.438074 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4kzs\" (UniqueName: \"kubernetes.io/projected/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-kube-api-access-z4kzs\") pod \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.438732 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-scripts\") pod \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\" (UID: \"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8\") " Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.443404 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-kube-api-access-z4kzs" (OuterVolumeSpecName: "kube-api-access-z4kzs") pod "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" (UID: "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8"). InnerVolumeSpecName "kube-api-access-z4kzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.444283 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" (UID: "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.444518 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-scripts" (OuterVolumeSpecName: "scripts") pod "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" (UID: "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.452950 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" (UID: "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.456425 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-config-data" (OuterVolumeSpecName: "config-data") pod "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" (UID: "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.464288 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" (UID: "0a983f37-57ca-459f-a6d7-aa70fe3bf2d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.540806 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.540861 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.540869 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.540877 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.540887 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4kzs\" (UniqueName: \"kubernetes.io/projected/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-kube-api-access-z4kzs\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.540898 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.941435 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" event={"ID":"0a983f37-57ca-459f-a6d7-aa70fe3bf2d8","Type":"ContainerDied","Data":"1cdceb5f840ca7eea3efa81ff9eeccedaa61d2a8b503e693598e2b31d124638c"} Oct 01 13:56:18 crc 
kubenswrapper[4774]: I1001 13:56:18.941574 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cdceb5f840ca7eea3efa81ff9eeccedaa61d2a8b503e693598e2b31d124638c" Oct 01 13:56:18 crc kubenswrapper[4774]: I1001 13:56:18.941525 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-knflt" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.166344 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-6bc654f99f-4hms9"] Oct 01 13:56:19 crc kubenswrapper[4774]: E1001 13:56:19.166676 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" containerName="keystone-bootstrap" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.166693 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" containerName="keystone-bootstrap" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.166844 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" containerName="keystone-bootstrap" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.167339 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.170549 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.172125 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.172162 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-internal-svc" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.172210 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.172252 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-jg57g" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.172302 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"cert-keystone-public-svc" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.172260 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"combined-ca-bundle" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.191081 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6bc654f99f-4hms9"] Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353564 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-public-tls-certs\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353628 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-config-data\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353667 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-credential-keys\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353754 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-scripts\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353785 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-fernet-keys\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353809 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-combined-ca-bundle\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 
13:56:19.353832 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-internal-tls-certs\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.353865 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5mh\" (UniqueName: \"kubernetes.io/projected/5093df82-e2ae-461a-8741-5c6221a736b1-kube-api-access-gs5mh\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455702 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-public-tls-certs\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455762 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-config-data\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455801 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-credential-keys\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc 
kubenswrapper[4774]: I1001 13:56:19.455841 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-scripts\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455878 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-fernet-keys\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455904 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-combined-ca-bundle\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455929 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-internal-tls-certs\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.455964 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs5mh\" (UniqueName: \"kubernetes.io/projected/5093df82-e2ae-461a-8741-5c6221a736b1-kube-api-access-gs5mh\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 
13:56:19.459421 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-config-data\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.459603 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-scripts\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.459948 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-public-tls-certs\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.460590 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-internal-tls-certs\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.460736 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-fernet-keys\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.461788 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-credential-keys\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.463953 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-combined-ca-bundle\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.488540 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5mh\" (UniqueName: \"kubernetes.io/projected/5093df82-e2ae-461a-8741-5c6221a736b1-kube-api-access-gs5mh\") pod \"keystone-6bc654f99f-4hms9\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:19 crc kubenswrapper[4774]: I1001 13:56:19.782523 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:20 crc kubenswrapper[4774]: I1001 13:56:20.274368 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-6bc654f99f-4hms9"] Oct 01 13:56:20 crc kubenswrapper[4774]: W1001 13:56:20.288867 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5093df82_e2ae_461a_8741_5c6221a736b1.slice/crio-d46b1bb1a7546ded2625488a858f6324c3d0703952f48ea2a577204c3f8d97c7 WatchSource:0}: Error finding container d46b1bb1a7546ded2625488a858f6324c3d0703952f48ea2a577204c3f8d97c7: Status 404 returned error can't find the container with id d46b1bb1a7546ded2625488a858f6324c3d0703952f48ea2a577204c3f8d97c7 Oct 01 13:56:20 crc kubenswrapper[4774]: I1001 13:56:20.963108 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" event={"ID":"5093df82-e2ae-461a-8741-5c6221a736b1","Type":"ContainerStarted","Data":"93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113"} Oct 01 13:56:20 crc kubenswrapper[4774]: I1001 13:56:20.963693 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:20 crc kubenswrapper[4774]: I1001 13:56:20.963710 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" event={"ID":"5093df82-e2ae-461a-8741-5c6221a736b1","Type":"ContainerStarted","Data":"d46b1bb1a7546ded2625488a858f6324c3d0703952f48ea2a577204c3f8d97c7"} Oct 01 13:56:51 crc kubenswrapper[4774]: I1001 13:56:51.196865 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:51 crc kubenswrapper[4774]: I1001 13:56:51.232981 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" 
podStartSLOduration=32.23294811 podStartE2EDuration="32.23294811s" podCreationTimestamp="2025-10-01 13:56:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:56:20.985064191 +0000 UTC m=+1152.874694808" watchObservedRunningTime="2025-10-01 13:56:51.23294811 +0000 UTC m=+1183.122578747" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.452400 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rtw5q"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.467433 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-rtw5q"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.473374 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-knflt"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.476855 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-knflt"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.491778 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6bc654f99f-4hms9"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.491966 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" podUID="5093df82-e2ae-461a-8741-5c6221a736b1" containerName="keystone-api" containerID="cri-o://93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113" gracePeriod=30 Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.520482 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone7389-account-delete-4cnb8"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.521406 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.530848 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone7389-account-delete-4cnb8"] Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.593401 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptcn\" (UniqueName: \"kubernetes.io/projected/66216b53-15e1-4892-9762-d1c6cdd1c941-kube-api-access-cptcn\") pod \"keystone7389-account-delete-4cnb8\" (UID: \"66216b53-15e1-4892-9762-d1c6cdd1c941\") " pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.694364 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cptcn\" (UniqueName: \"kubernetes.io/projected/66216b53-15e1-4892-9762-d1c6cdd1c941-kube-api-access-cptcn\") pod \"keystone7389-account-delete-4cnb8\" (UID: \"66216b53-15e1-4892-9762-d1c6cdd1c941\") " pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.711488 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptcn\" (UniqueName: \"kubernetes.io/projected/66216b53-15e1-4892-9762-d1c6cdd1c941-kube-api-access-cptcn\") pod \"keystone7389-account-delete-4cnb8\" (UID: \"66216b53-15e1-4892-9762-d1c6cdd1c941\") " pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.839111 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.878793 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a983f37-57ca-459f-a6d7-aa70fe3bf2d8" path="/var/lib/kubelet/pods/0a983f37-57ca-459f-a6d7-aa70fe3bf2d8/volumes" Oct 01 13:56:52 crc kubenswrapper[4774]: I1001 13:56:52.879423 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11507957-af39-4768-9071-cf2579357a21" path="/var/lib/kubelet/pods/11507957-af39-4768-9071-cf2579357a21/volumes" Oct 01 13:56:53 crc kubenswrapper[4774]: I1001 13:56:53.318920 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone7389-account-delete-4cnb8"] Oct 01 13:56:54 crc kubenswrapper[4774]: I1001 13:56:54.310926 4774 generic.go:334] "Generic (PLEG): container finished" podID="66216b53-15e1-4892-9762-d1c6cdd1c941" containerID="665c6e5e8441c6a74bb84adfe7e1a6f3121c2d0340dd2fb9923c1364c7036fc1" exitCode=0 Oct 01 13:56:54 crc kubenswrapper[4774]: I1001 13:56:54.311810 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" event={"ID":"66216b53-15e1-4892-9762-d1c6cdd1c941","Type":"ContainerDied","Data":"665c6e5e8441c6a74bb84adfe7e1a6f3121c2d0340dd2fb9923c1364c7036fc1"} Oct 01 13:56:54 crc kubenswrapper[4774]: I1001 13:56:54.314316 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" event={"ID":"66216b53-15e1-4892-9762-d1c6cdd1c941","Type":"ContainerStarted","Data":"12b9ee57cc20a7cceeae02027009259859ac5918964d8b5ccffaf5c20ced46b9"} Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.777687 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.841892 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cptcn\" (UniqueName: \"kubernetes.io/projected/66216b53-15e1-4892-9762-d1c6cdd1c941-kube-api-access-cptcn\") pod \"66216b53-15e1-4892-9762-d1c6cdd1c941\" (UID: \"66216b53-15e1-4892-9762-d1c6cdd1c941\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.847872 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66216b53-15e1-4892-9762-d1c6cdd1c941-kube-api-access-cptcn" (OuterVolumeSpecName: "kube-api-access-cptcn") pod "66216b53-15e1-4892-9762-d1c6cdd1c941" (UID: "66216b53-15e1-4892-9762-d1c6cdd1c941"). InnerVolumeSpecName "kube-api-access-cptcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.898974 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.943987 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-fernet-keys\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.944080 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-internal-tls-certs\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.944102 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-credential-keys\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.944136 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs5mh\" (UniqueName: \"kubernetes.io/projected/5093df82-e2ae-461a-8741-5c6221a736b1-kube-api-access-gs5mh\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.944163 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-scripts\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.944189 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-config-data\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.944953 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-public-tls-certs\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.945056 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-combined-ca-bundle\") pod \"5093df82-e2ae-461a-8741-5c6221a736b1\" (UID: \"5093df82-e2ae-461a-8741-5c6221a736b1\") " Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.945346 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cptcn\" (UniqueName: \"kubernetes.io/projected/66216b53-15e1-4892-9762-d1c6cdd1c941-kube-api-access-cptcn\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.960915 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.960989 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.961601 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5093df82-e2ae-461a-8741-5c6221a736b1-kube-api-access-gs5mh" (OuterVolumeSpecName: "kube-api-access-gs5mh") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "kube-api-access-gs5mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.961780 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-scripts" (OuterVolumeSpecName: "scripts") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.968258 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.974777 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.979707 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-config-data" (OuterVolumeSpecName: "config-data") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:55 crc kubenswrapper[4774]: I1001 13:56:55.980077 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5093df82-e2ae-461a-8741-5c6221a736b1" (UID: "5093df82-e2ae-461a-8741-5c6221a736b1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046407 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046429 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5mh\" (UniqueName: \"kubernetes.io/projected/5093df82-e2ae-461a-8741-5c6221a736b1-kube-api-access-gs5mh\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046439 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046449 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 
13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046487 4774 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046495 4774 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046503 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.046511 4774 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5093df82-e2ae-461a-8741-5c6221a736b1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.335423 4774 generic.go:334] "Generic (PLEG): container finished" podID="5093df82-e2ae-461a-8741-5c6221a736b1" containerID="93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113" exitCode=0 Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.335534 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.335546 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" event={"ID":"5093df82-e2ae-461a-8741-5c6221a736b1","Type":"ContainerDied","Data":"93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113"} Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.336075 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-6bc654f99f-4hms9" event={"ID":"5093df82-e2ae-461a-8741-5c6221a736b1","Type":"ContainerDied","Data":"d46b1bb1a7546ded2625488a858f6324c3d0703952f48ea2a577204c3f8d97c7"} Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.336116 4774 scope.go:117] "RemoveContainer" containerID="93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.338554 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" event={"ID":"66216b53-15e1-4892-9762-d1c6cdd1c941","Type":"ContainerDied","Data":"12b9ee57cc20a7cceeae02027009259859ac5918964d8b5ccffaf5c20ced46b9"} Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.338617 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone7389-account-delete-4cnb8" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.338625 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b9ee57cc20a7cceeae02027009259859ac5918964d8b5ccffaf5c20ced46b9" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.371744 4774 scope.go:117] "RemoveContainer" containerID="93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113" Oct 01 13:56:56 crc kubenswrapper[4774]: E1001 13:56:56.372520 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113\": container with ID starting with 93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113 not found: ID does not exist" containerID="93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.372579 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113"} err="failed to get container status \"93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113\": rpc error: code = NotFound desc = could not find container \"93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113\": container with ID starting with 93c90e47a573e4d3142f4945c35e6fb3d3158c19644489c888a0f95d70137113 not found: ID does not exist" Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.394693 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-6bc654f99f-4hms9"] Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.401993 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-6bc654f99f-4hms9"] Oct 01 13:56:56 crc kubenswrapper[4774]: I1001 13:56:56.881226 4774 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5093df82-e2ae-461a-8741-5c6221a736b1" path="/var/lib/kubelet/pods/5093df82-e2ae-461a-8741-5c6221a736b1/volumes" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.557087 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6pzll"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.570987 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-6pzll"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.583436 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone7389-account-delete-4cnb8"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.597289 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-7389-account-create-b9gjc"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.605041 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone7389-account-delete-4cnb8"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.612137 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-7389-account-create-b9gjc"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.705054 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vqzcm"] Oct 01 13:56:57 crc kubenswrapper[4774]: E1001 13:56:57.705809 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66216b53-15e1-4892-9762-d1c6cdd1c941" containerName="mariadb-account-delete" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.705848 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="66216b53-15e1-4892-9762-d1c6cdd1c941" containerName="mariadb-account-delete" Oct 01 13:56:57 crc kubenswrapper[4774]: E1001 13:56:57.705871 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5093df82-e2ae-461a-8741-5c6221a736b1" containerName="keystone-api" Oct 01 13:56:57 
crc kubenswrapper[4774]: I1001 13:56:57.705881 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="5093df82-e2ae-461a-8741-5c6221a736b1" containerName="keystone-api" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.706282 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="5093df82-e2ae-461a-8741-5c6221a736b1" containerName="keystone-api" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.706303 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="66216b53-15e1-4892-9762-d1c6cdd1c941" containerName="mariadb-account-delete" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.707386 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.732196 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vqzcm"] Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.773893 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v49fn\" (UniqueName: \"kubernetes.io/projected/564820e0-1c91-4d46-9a71-a9cf7f4b68f6-kube-api-access-v49fn\") pod \"keystone-db-create-vqzcm\" (UID: \"564820e0-1c91-4d46-9a71-a9cf7f4b68f6\") " pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.874632 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v49fn\" (UniqueName: \"kubernetes.io/projected/564820e0-1c91-4d46-9a71-a9cf7f4b68f6-kube-api-access-v49fn\") pod \"keystone-db-create-vqzcm\" (UID: \"564820e0-1c91-4d46-9a71-a9cf7f4b68f6\") " pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:56:57 crc kubenswrapper[4774]: I1001 13:56:57.898401 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v49fn\" (UniqueName: 
\"kubernetes.io/projected/564820e0-1c91-4d46-9a71-a9cf7f4b68f6-kube-api-access-v49fn\") pod \"keystone-db-create-vqzcm\" (UID: \"564820e0-1c91-4d46-9a71-a9cf7f4b68f6\") " pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:56:58 crc kubenswrapper[4774]: I1001 13:56:58.049928 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:56:58 crc kubenswrapper[4774]: I1001 13:56:58.550932 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vqzcm"] Oct 01 13:56:58 crc kubenswrapper[4774]: I1001 13:56:58.884243 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66216b53-15e1-4892-9762-d1c6cdd1c941" path="/var/lib/kubelet/pods/66216b53-15e1-4892-9762-d1c6cdd1c941/volumes" Oct 01 13:56:58 crc kubenswrapper[4774]: I1001 13:56:58.886032 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b10d1d-decb-40f0-8f45-74c466d1d9be" path="/var/lib/kubelet/pods/96b10d1d-decb-40f0-8f45-74c466d1d9be/volumes" Oct 01 13:56:58 crc kubenswrapper[4774]: I1001 13:56:58.887034 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3e9e066-29cc-4cde-aa4f-30e95341ff25" path="/var/lib/kubelet/pods/f3e9e066-29cc-4cde-aa4f-30e95341ff25/volumes" Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 13:56:59.369331 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536" exitCode=1 Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 13:56:59.369512 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536"} Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 
13:56:59.369597 4774 scope.go:117] "RemoveContainer" containerID="3a7a736c4f823eab2910dec0cccbca7dff735f259bf49121d23ef355416c9e32" Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 13:56:59.370688 4774 scope.go:117] "RemoveContainer" containerID="ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536" Oct 01 13:56:59 crc kubenswrapper[4774]: E1001 13:56:59.371148 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 13:56:59.372145 4774 generic.go:334] "Generic (PLEG): container finished" podID="564820e0-1c91-4d46-9a71-a9cf7f4b68f6" containerID="388a166cb679ecb5604fb371659818bde230cd3e9580a6d71478ab3f557f73f6" exitCode=0 Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 13:56:59.372212 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" event={"ID":"564820e0-1c91-4d46-9a71-a9cf7f4b68f6","Type":"ContainerDied","Data":"388a166cb679ecb5604fb371659818bde230cd3e9580a6d71478ab3f557f73f6"} Oct 01 13:56:59 crc kubenswrapper[4774]: I1001 13:56:59.372291 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" event={"ID":"564820e0-1c91-4d46-9a71-a9cf7f4b68f6","Type":"ContainerStarted","Data":"1080e9c5e21aeaa90d105f570d222d76eeae99b97692182e2481452d08962976"} Oct 01 13:57:00 crc kubenswrapper[4774]: I1001 13:57:00.704836 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:57:00 crc kubenswrapper[4774]: I1001 13:57:00.814544 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v49fn\" (UniqueName: \"kubernetes.io/projected/564820e0-1c91-4d46-9a71-a9cf7f4b68f6-kube-api-access-v49fn\") pod \"564820e0-1c91-4d46-9a71-a9cf7f4b68f6\" (UID: \"564820e0-1c91-4d46-9a71-a9cf7f4b68f6\") " Oct 01 13:57:00 crc kubenswrapper[4774]: I1001 13:57:00.821627 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564820e0-1c91-4d46-9a71-a9cf7f4b68f6-kube-api-access-v49fn" (OuterVolumeSpecName: "kube-api-access-v49fn") pod "564820e0-1c91-4d46-9a71-a9cf7f4b68f6" (UID: "564820e0-1c91-4d46-9a71-a9cf7f4b68f6"). InnerVolumeSpecName "kube-api-access-v49fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:57:00 crc kubenswrapper[4774]: I1001 13:57:00.916963 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v49fn\" (UniqueName: \"kubernetes.io/projected/564820e0-1c91-4d46-9a71-a9cf7f4b68f6-kube-api-access-v49fn\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:01 crc kubenswrapper[4774]: I1001 13:57:01.390979 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" event={"ID":"564820e0-1c91-4d46-9a71-a9cf7f4b68f6","Type":"ContainerDied","Data":"1080e9c5e21aeaa90d105f570d222d76eeae99b97692182e2481452d08962976"} Oct 01 13:57:01 crc kubenswrapper[4774]: I1001 13:57:01.391022 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1080e9c5e21aeaa90d105f570d222d76eeae99b97692182e2481452d08962976" Oct 01 13:57:01 crc kubenswrapper[4774]: I1001 13:57:01.391079 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-vqzcm" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.659341 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n"] Oct 01 13:57:07 crc kubenswrapper[4774]: E1001 13:57:07.660194 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564820e0-1c91-4d46-9a71-a9cf7f4b68f6" containerName="mariadb-database-create" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.660208 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="564820e0-1c91-4d46-9a71-a9cf7f4b68f6" containerName="mariadb-database-create" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.660371 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="564820e0-1c91-4d46-9a71-a9cf7f4b68f6" containerName="mariadb-database-create" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.660923 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.663874 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.672661 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n"] Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.836047 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2blk\" (UniqueName: \"kubernetes.io/projected/bf64d03f-0ff8-4021-9979-07d407624518-kube-api-access-h2blk\") pod \"keystone-f1e9-account-create-4dg7n\" (UID: \"bf64d03f-0ff8-4021-9979-07d407624518\") " pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.937712 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h2blk\" (UniqueName: \"kubernetes.io/projected/bf64d03f-0ff8-4021-9979-07d407624518-kube-api-access-h2blk\") pod \"keystone-f1e9-account-create-4dg7n\" (UID: \"bf64d03f-0ff8-4021-9979-07d407624518\") " pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.974261 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2blk\" (UniqueName: \"kubernetes.io/projected/bf64d03f-0ff8-4021-9979-07d407624518-kube-api-access-h2blk\") pod \"keystone-f1e9-account-create-4dg7n\" (UID: \"bf64d03f-0ff8-4021-9979-07d407624518\") " pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:07 crc kubenswrapper[4774]: I1001 13:57:07.988732 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:08 crc kubenswrapper[4774]: I1001 13:57:08.289190 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:57:08 crc kubenswrapper[4774]: I1001 13:57:08.289769 4774 scope.go:117] "RemoveContainer" containerID="ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536" Oct 01 13:57:08 crc kubenswrapper[4774]: E1001 13:57:08.290091 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 13:57:08 crc kubenswrapper[4774]: I1001 13:57:08.465311 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n"] Oct 01 13:57:09 crc kubenswrapper[4774]: I1001 13:57:09.459534 4774 generic.go:334] "Generic (PLEG): container finished" podID="bf64d03f-0ff8-4021-9979-07d407624518" containerID="f28457dd8a2f8ff9c9a8270b6b65bb2dea446bf883280976db1f4cc4a5dc61a9" exitCode=0 Oct 01 13:57:09 crc kubenswrapper[4774]: I1001 13:57:09.459695 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" event={"ID":"bf64d03f-0ff8-4021-9979-07d407624518","Type":"ContainerDied","Data":"f28457dd8a2f8ff9c9a8270b6b65bb2dea446bf883280976db1f4cc4a5dc61a9"} Oct 01 13:57:09 crc kubenswrapper[4774]: I1001 13:57:09.459958 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" event={"ID":"bf64d03f-0ff8-4021-9979-07d407624518","Type":"ContainerStarted","Data":"320837ec5001e767eae88614662efa70fb8e65b63397519353730f2c3dab136f"} Oct 01 13:57:10 crc kubenswrapper[4774]: I1001 13:57:10.756596 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:10 crc kubenswrapper[4774]: I1001 13:57:10.883876 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2blk\" (UniqueName: \"kubernetes.io/projected/bf64d03f-0ff8-4021-9979-07d407624518-kube-api-access-h2blk\") pod \"bf64d03f-0ff8-4021-9979-07d407624518\" (UID: \"bf64d03f-0ff8-4021-9979-07d407624518\") " Oct 01 13:57:10 crc kubenswrapper[4774]: I1001 13:57:10.894572 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf64d03f-0ff8-4021-9979-07d407624518-kube-api-access-h2blk" (OuterVolumeSpecName: "kube-api-access-h2blk") pod "bf64d03f-0ff8-4021-9979-07d407624518" (UID: "bf64d03f-0ff8-4021-9979-07d407624518"). InnerVolumeSpecName "kube-api-access-h2blk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:57:10 crc kubenswrapper[4774]: I1001 13:57:10.985576 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2blk\" (UniqueName: \"kubernetes.io/projected/bf64d03f-0ff8-4021-9979-07d407624518-kube-api-access-h2blk\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:11 crc kubenswrapper[4774]: I1001 13:57:11.476165 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" event={"ID":"bf64d03f-0ff8-4021-9979-07d407624518","Type":"ContainerDied","Data":"320837ec5001e767eae88614662efa70fb8e65b63397519353730f2c3dab136f"} Oct 01 13:57:11 crc kubenswrapper[4774]: I1001 13:57:11.476227 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320837ec5001e767eae88614662efa70fb8e65b63397519353730f2c3dab136f" Oct 01 13:57:11 crc kubenswrapper[4774]: I1001 13:57:11.476243 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n" Oct 01 13:57:18 crc kubenswrapper[4774]: I1001 13:57:18.289521 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:57:18 crc kubenswrapper[4774]: I1001 13:57:18.291247 4774 scope.go:117] "RemoveContainer" containerID="ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536" Oct 01 13:57:18 crc kubenswrapper[4774]: I1001 13:57:18.539545 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56"} Oct 01 13:57:18 crc kubenswrapper[4774]: I1001 13:57:18.540039 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:57:28 crc kubenswrapper[4774]: I1001 13:57:28.298374 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.250602 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-44vjk"] Oct 01 13:57:38 crc kubenswrapper[4774]: E1001 13:57:38.251306 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf64d03f-0ff8-4021-9979-07d407624518" containerName="mariadb-account-create" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.251318 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf64d03f-0ff8-4021-9979-07d407624518" containerName="mariadb-account-create" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.251427 4774 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bf64d03f-0ff8-4021-9979-07d407624518" containerName="mariadb-account-create" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.251854 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.254074 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.254094 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.254279 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-fcqd4" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.257707 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.265164 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-44vjk"] Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.326972 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd262894-eab3-432c-bd57-8b43950fccab-config-data\") pod \"keystone-db-sync-44vjk\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.327036 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g25tc\" (UniqueName: \"kubernetes.io/projected/fd262894-eab3-432c-bd57-8b43950fccab-kube-api-access-g25tc\") pod \"keystone-db-sync-44vjk\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc 
kubenswrapper[4774]: I1001 13:57:38.428438 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd262894-eab3-432c-bd57-8b43950fccab-config-data\") pod \"keystone-db-sync-44vjk\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.428585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g25tc\" (UniqueName: \"kubernetes.io/projected/fd262894-eab3-432c-bd57-8b43950fccab-kube-api-access-g25tc\") pod \"keystone-db-sync-44vjk\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.436171 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd262894-eab3-432c-bd57-8b43950fccab-config-data\") pod \"keystone-db-sync-44vjk\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.455415 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g25tc\" (UniqueName: \"kubernetes.io/projected/fd262894-eab3-432c-bd57-8b43950fccab-kube-api-access-g25tc\") pod \"keystone-db-sync-44vjk\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.569557 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:38 crc kubenswrapper[4774]: I1001 13:57:38.906732 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-44vjk"] Oct 01 13:57:38 crc kubenswrapper[4774]: W1001 13:57:38.915426 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd262894_eab3_432c_bd57_8b43950fccab.slice/crio-b49643e1e43fa5decfb3ef57d8ff40ecc430d58dac8929400f4035603b2c19ab WatchSource:0}: Error finding container b49643e1e43fa5decfb3ef57d8ff40ecc430d58dac8929400f4035603b2c19ab: Status 404 returned error can't find the container with id b49643e1e43fa5decfb3ef57d8ff40ecc430d58dac8929400f4035603b2c19ab Oct 01 13:57:39 crc kubenswrapper[4774]: I1001 13:57:39.733316 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" event={"ID":"fd262894-eab3-432c-bd57-8b43950fccab","Type":"ContainerStarted","Data":"afed800a1b6c3c2802da4784906bd35b48a7e9e9c4346eabe3d908f6762949be"} Oct 01 13:57:39 crc kubenswrapper[4774]: I1001 13:57:39.733777 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" event={"ID":"fd262894-eab3-432c-bd57-8b43950fccab","Type":"ContainerStarted","Data":"b49643e1e43fa5decfb3ef57d8ff40ecc430d58dac8929400f4035603b2c19ab"} Oct 01 13:57:39 crc kubenswrapper[4774]: I1001 13:57:39.761825 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" podStartSLOduration=1.761800825 podStartE2EDuration="1.761800825s" podCreationTimestamp="2025-10-01 13:57:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:57:39.759760421 +0000 UTC m=+1231.649391048" watchObservedRunningTime="2025-10-01 13:57:39.761800825 +0000 UTC m=+1231.651431422" 
Oct 01 13:57:40 crc kubenswrapper[4774]: I1001 13:57:40.745725 4774 generic.go:334] "Generic (PLEG): container finished" podID="fd262894-eab3-432c-bd57-8b43950fccab" containerID="afed800a1b6c3c2802da4784906bd35b48a7e9e9c4346eabe3d908f6762949be" exitCode=0 Oct 01 13:57:40 crc kubenswrapper[4774]: I1001 13:57:40.745809 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" event={"ID":"fd262894-eab3-432c-bd57-8b43950fccab","Type":"ContainerDied","Data":"afed800a1b6c3c2802da4784906bd35b48a7e9e9c4346eabe3d908f6762949be"} Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.096154 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.108478 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g25tc\" (UniqueName: \"kubernetes.io/projected/fd262894-eab3-432c-bd57-8b43950fccab-kube-api-access-g25tc\") pod \"fd262894-eab3-432c-bd57-8b43950fccab\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.108586 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd262894-eab3-432c-bd57-8b43950fccab-config-data\") pod \"fd262894-eab3-432c-bd57-8b43950fccab\" (UID: \"fd262894-eab3-432c-bd57-8b43950fccab\") " Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.119086 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd262894-eab3-432c-bd57-8b43950fccab-kube-api-access-g25tc" (OuterVolumeSpecName: "kube-api-access-g25tc") pod "fd262894-eab3-432c-bd57-8b43950fccab" (UID: "fd262894-eab3-432c-bd57-8b43950fccab"). InnerVolumeSpecName "kube-api-access-g25tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.149370 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd262894-eab3-432c-bd57-8b43950fccab-config-data" (OuterVolumeSpecName: "config-data") pod "fd262894-eab3-432c-bd57-8b43950fccab" (UID: "fd262894-eab3-432c-bd57-8b43950fccab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.214179 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g25tc\" (UniqueName: \"kubernetes.io/projected/fd262894-eab3-432c-bd57-8b43950fccab-kube-api-access-g25tc\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.214206 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd262894-eab3-432c-bd57-8b43950fccab-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.762714 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" event={"ID":"fd262894-eab3-432c-bd57-8b43950fccab","Type":"ContainerDied","Data":"b49643e1e43fa5decfb3ef57d8ff40ecc430d58dac8929400f4035603b2c19ab"} Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.762774 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49643e1e43fa5decfb3ef57d8ff40ecc430d58dac8929400f4035603b2c19ab" Oct 01 13:57:42 crc kubenswrapper[4774]: I1001 13:57:42.763121 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-44vjk" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.310881 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r7jv9"] Oct 01 13:57:43 crc kubenswrapper[4774]: E1001 13:57:43.311325 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd262894-eab3-432c-bd57-8b43950fccab" containerName="keystone-db-sync" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.311336 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd262894-eab3-432c-bd57-8b43950fccab" containerName="keystone-db-sync" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.311466 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd262894-eab3-432c-bd57-8b43950fccab" containerName="keystone-db-sync" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.311863 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.315225 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.315293 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.315445 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.315701 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-fcqd4" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.328394 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-fernet-keys\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.328559 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-credential-keys\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.328612 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-scripts\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.328643 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-config-data\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.328687 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkgth\" (UniqueName: \"kubernetes.io/projected/93d103ac-9295-4d73-82f5-6023b7b28c99-kube-api-access-tkgth\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.342658 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["keystone-kuttl-tests/keystone-bootstrap-r7jv9"] Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.432023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-fernet-keys\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.432106 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-credential-keys\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.432192 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-scripts\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.432220 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-config-data\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.432891 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkgth\" (UniqueName: \"kubernetes.io/projected/93d103ac-9295-4d73-82f5-6023b7b28c99-kube-api-access-tkgth\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 
13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.437262 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-credential-keys\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.440529 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-fernet-keys\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.443181 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-config-data\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.443628 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-scripts\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.452593 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkgth\" (UniqueName: \"kubernetes.io/projected/93d103ac-9295-4d73-82f5-6023b7b28c99-kube-api-access-tkgth\") pod \"keystone-bootstrap-r7jv9\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:43 crc kubenswrapper[4774]: I1001 13:57:43.641152 4774 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:44 crc kubenswrapper[4774]: I1001 13:57:44.106932 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r7jv9"] Oct 01 13:57:44 crc kubenswrapper[4774]: I1001 13:57:44.779443 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" event={"ID":"93d103ac-9295-4d73-82f5-6023b7b28c99","Type":"ContainerStarted","Data":"800992360e4ad4cb2b1652e58cdf8bf2e35adfe118b55abd740c79a2caf7e121"} Oct 01 13:57:44 crc kubenswrapper[4774]: I1001 13:57:44.779856 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" event={"ID":"93d103ac-9295-4d73-82f5-6023b7b28c99","Type":"ContainerStarted","Data":"6bcd17b3ef73389ac467b409f4236a7bd6f4e98a54d6b63c9bb616a2f0873bf4"} Oct 01 13:57:44 crc kubenswrapper[4774]: I1001 13:57:44.807387 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" podStartSLOduration=1.807357527 podStartE2EDuration="1.807357527s" podCreationTimestamp="2025-10-01 13:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:57:44.803730241 +0000 UTC m=+1236.693360868" watchObservedRunningTime="2025-10-01 13:57:44.807357527 +0000 UTC m=+1236.696988164" Oct 01 13:57:47 crc kubenswrapper[4774]: I1001 13:57:47.803496 4774 generic.go:334] "Generic (PLEG): container finished" podID="93d103ac-9295-4d73-82f5-6023b7b28c99" containerID="800992360e4ad4cb2b1652e58cdf8bf2e35adfe118b55abd740c79a2caf7e121" exitCode=0 Oct 01 13:57:47 crc kubenswrapper[4774]: I1001 13:57:47.803505 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" 
event={"ID":"93d103ac-9295-4d73-82f5-6023b7b28c99","Type":"ContainerDied","Data":"800992360e4ad4cb2b1652e58cdf8bf2e35adfe118b55abd740c79a2caf7e121"} Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.206141 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.322588 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkgth\" (UniqueName: \"kubernetes.io/projected/93d103ac-9295-4d73-82f5-6023b7b28c99-kube-api-access-tkgth\") pod \"93d103ac-9295-4d73-82f5-6023b7b28c99\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.322693 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-scripts\") pod \"93d103ac-9295-4d73-82f5-6023b7b28c99\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.322768 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-config-data\") pod \"93d103ac-9295-4d73-82f5-6023b7b28c99\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.322829 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-credential-keys\") pod \"93d103ac-9295-4d73-82f5-6023b7b28c99\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.322877 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-fernet-keys\") 
pod \"93d103ac-9295-4d73-82f5-6023b7b28c99\" (UID: \"93d103ac-9295-4d73-82f5-6023b7b28c99\") " Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.331576 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-scripts" (OuterVolumeSpecName: "scripts") pod "93d103ac-9295-4d73-82f5-6023b7b28c99" (UID: "93d103ac-9295-4d73-82f5-6023b7b28c99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.331689 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "93d103ac-9295-4d73-82f5-6023b7b28c99" (UID: "93d103ac-9295-4d73-82f5-6023b7b28c99"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.332601 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "93d103ac-9295-4d73-82f5-6023b7b28c99" (UID: "93d103ac-9295-4d73-82f5-6023b7b28c99"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.332765 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d103ac-9295-4d73-82f5-6023b7b28c99-kube-api-access-tkgth" (OuterVolumeSpecName: "kube-api-access-tkgth") pod "93d103ac-9295-4d73-82f5-6023b7b28c99" (UID: "93d103ac-9295-4d73-82f5-6023b7b28c99"). InnerVolumeSpecName "kube-api-access-tkgth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.357900 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-config-data" (OuterVolumeSpecName: "config-data") pod "93d103ac-9295-4d73-82f5-6023b7b28c99" (UID: "93d103ac-9295-4d73-82f5-6023b7b28c99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.425036 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.425076 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.425088 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.425098 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/93d103ac-9295-4d73-82f5-6023b7b28c99-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.425108 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkgth\" (UniqueName: \"kubernetes.io/projected/93d103ac-9295-4d73-82f5-6023b7b28c99-kube-api-access-tkgth\") on node \"crc\" DevicePath \"\"" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.824210 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" 
event={"ID":"93d103ac-9295-4d73-82f5-6023b7b28c99","Type":"ContainerDied","Data":"6bcd17b3ef73389ac467b409f4236a7bd6f4e98a54d6b63c9bb616a2f0873bf4"} Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.824270 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bcd17b3ef73389ac467b409f4236a7bd6f4e98a54d6b63c9bb616a2f0873bf4" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.824333 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-r7jv9" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.936631 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-9c46g"] Oct 01 13:57:49 crc kubenswrapper[4774]: E1001 13:57:49.937356 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d103ac-9295-4d73-82f5-6023b7b28c99" containerName="keystone-bootstrap" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.937383 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d103ac-9295-4d73-82f5-6023b7b28c99" containerName="keystone-bootstrap" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.937569 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d103ac-9295-4d73-82f5-6023b7b28c99" containerName="keystone-bootstrap" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.938175 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.940926 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.942292 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.944618 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-fcqd4" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.946039 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:57:49 crc kubenswrapper[4774]: I1001 13:57:49.955403 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-9c46g"] Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.035486 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln2xd\" (UniqueName: \"kubernetes.io/projected/d69b0665-c419-4457-9636-da3472bc413a-kube-api-access-ln2xd\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.035585 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-credential-keys\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.035622 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-scripts\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.035657 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-fernet-keys\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.035681 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-config-data\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.137341 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-credential-keys\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.137446 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-scripts\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.137545 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-fernet-keys\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.137586 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-config-data\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.137703 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln2xd\" (UniqueName: \"kubernetes.io/projected/d69b0665-c419-4457-9636-da3472bc413a-kube-api-access-ln2xd\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.142028 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-scripts\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.142714 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-credential-keys\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.142789 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-fernet-keys\") 
pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.143347 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-config-data\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.161804 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln2xd\" (UniqueName: \"kubernetes.io/projected/d69b0665-c419-4457-9636-da3472bc413a-kube-api-access-ln2xd\") pod \"keystone-765bbc8bd7-9c46g\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.255763 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.508863 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-9c46g"] Oct 01 13:57:50 crc kubenswrapper[4774]: W1001 13:57:50.513616 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd69b0665_c419_4457_9636_da3472bc413a.slice/crio-30cf65d95d4e4d0a43ba2faede9cf82bee2d2e6e5e3f7150f88c22fca9ef3714 WatchSource:0}: Error finding container 30cf65d95d4e4d0a43ba2faede9cf82bee2d2e6e5e3f7150f88c22fca9ef3714: Status 404 returned error can't find the container with id 30cf65d95d4e4d0a43ba2faede9cf82bee2d2e6e5e3f7150f88c22fca9ef3714 Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.834856 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" event={"ID":"d69b0665-c419-4457-9636-da3472bc413a","Type":"ContainerStarted","Data":"040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557"} Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.834904 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" event={"ID":"d69b0665-c419-4457-9636-da3472bc413a","Type":"ContainerStarted","Data":"30cf65d95d4e4d0a43ba2faede9cf82bee2d2e6e5e3f7150f88c22fca9ef3714"} Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.835055 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:57:50 crc kubenswrapper[4774]: I1001 13:57:50.870600 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" podStartSLOduration=1.870565988 podStartE2EDuration="1.870565988s" podCreationTimestamp="2025-10-01 13:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:57:50.862642607 +0000 UTC m=+1242.752273274" watchObservedRunningTime="2025-10-01 13:57:50.870565988 +0000 UTC m=+1242.760196655" Oct 01 13:58:07 crc kubenswrapper[4774]: I1001 13:58:07.270875 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:58:07 crc kubenswrapper[4774]: I1001 13:58:07.271669 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:58:21 crc kubenswrapper[4774]: I1001 13:58:21.699063 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.195902 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6"] Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.197488 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.209360 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6"] Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.217769 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-295gw"] Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.219313 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.238302 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-295gw"] Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366623 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xb5b\" (UniqueName: \"kubernetes.io/projected/b32a5973-c2a5-48de-b58a-4072ef19735a-kube-api-access-9xb5b\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366704 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-credential-keys\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366742 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-credential-keys\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366766 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-config-data\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366811 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-config-data\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366845 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-scripts\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.366865 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbkgs\" (UniqueName: \"kubernetes.io/projected/0531feae-96b1-4efc-975d-efaa5dddcbd0-kube-api-access-rbkgs\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.367129 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-fernet-keys\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.367200 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-fernet-keys\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 
13:58:23.367314 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-scripts\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469430 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-credential-keys\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469563 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-credential-keys\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469599 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-config-data\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469661 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-config-data\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469710 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-scripts\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469746 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbkgs\" (UniqueName: \"kubernetes.io/projected/0531feae-96b1-4efc-975d-efaa5dddcbd0-kube-api-access-rbkgs\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469846 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-fernet-keys\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469910 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-fernet-keys\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.469959 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-scripts\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.470026 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9xb5b\" (UniqueName: \"kubernetes.io/projected/b32a5973-c2a5-48de-b58a-4072ef19735a-kube-api-access-9xb5b\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.477290 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-fernet-keys\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.477413 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-config-data\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.481323 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-scripts\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.481549 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-credential-keys\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.481605 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-fernet-keys\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.481961 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-config-data\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.482230 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-credential-keys\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.483107 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-scripts\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.507285 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xb5b\" (UniqueName: \"kubernetes.io/projected/b32a5973-c2a5-48de-b58a-4072ef19735a-kube-api-access-9xb5b\") pod \"keystone-765bbc8bd7-295gw\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.513389 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbkgs\" (UniqueName: 
\"kubernetes.io/projected/0531feae-96b1-4efc-975d-efaa5dddcbd0-kube-api-access-rbkgs\") pod \"keystone-765bbc8bd7-q5sc6\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.520175 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.540274 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:23 crc kubenswrapper[4774]: I1001 13:58:23.804569 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6"] Oct 01 13:58:24 crc kubenswrapper[4774]: I1001 13:58:24.063344 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-295gw"] Oct 01 13:58:24 crc kubenswrapper[4774]: W1001 13:58:24.071050 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb32a5973_c2a5_48de_b58a_4072ef19735a.slice/crio-ff72b1f99edec4206c88532f3e85874e79dce1e6a4f944d452f43f3ec21d9cfb WatchSource:0}: Error finding container ff72b1f99edec4206c88532f3e85874e79dce1e6a4f944d452f43f3ec21d9cfb: Status 404 returned error can't find the container with id ff72b1f99edec4206c88532f3e85874e79dce1e6a4f944d452f43f3ec21d9cfb Oct 01 13:58:24 crc kubenswrapper[4774]: I1001 13:58:24.161838 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" event={"ID":"0531feae-96b1-4efc-975d-efaa5dddcbd0","Type":"ContainerStarted","Data":"f041d95bdc63a42ef9c89dde4ad68b60114d6ab2b7efca9f9a47875517dd660c"} Oct 01 13:58:24 crc kubenswrapper[4774]: I1001 13:58:24.161882 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" 
event={"ID":"0531feae-96b1-4efc-975d-efaa5dddcbd0","Type":"ContainerStarted","Data":"25a88e3efc123a2ea79dd70f776dba0042de0504b8857577941e9bd7301788da"} Oct 01 13:58:24 crc kubenswrapper[4774]: I1001 13:58:24.162051 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:24 crc kubenswrapper[4774]: I1001 13:58:24.163047 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" event={"ID":"b32a5973-c2a5-48de-b58a-4072ef19735a","Type":"ContainerStarted","Data":"ff72b1f99edec4206c88532f3e85874e79dce1e6a4f944d452f43f3ec21d9cfb"} Oct 01 13:58:24 crc kubenswrapper[4774]: I1001 13:58:24.177689 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" podStartSLOduration=1.177648095 podStartE2EDuration="1.177648095s" podCreationTimestamp="2025-10-01 13:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:58:24.175114298 +0000 UTC m=+1276.064744895" watchObservedRunningTime="2025-10-01 13:58:24.177648095 +0000 UTC m=+1276.067278712" Oct 01 13:58:25 crc kubenswrapper[4774]: I1001 13:58:25.174269 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" event={"ID":"b32a5973-c2a5-48de-b58a-4072ef19735a","Type":"ContainerStarted","Data":"422c73ad0f982c2f902b8cf5eb2e8cc85613f8939880ea1d4b11ee04b64d9760"} Oct 01 13:58:25 crc kubenswrapper[4774]: I1001 13:58:25.175108 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:25 crc kubenswrapper[4774]: I1001 13:58:25.205668 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" podStartSLOduration=2.205632448 
podStartE2EDuration="2.205632448s" podCreationTimestamp="2025-10-01 13:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:58:25.199354901 +0000 UTC m=+1277.088985528" watchObservedRunningTime="2025-10-01 13:58:25.205632448 +0000 UTC m=+1277.095263085" Oct 01 13:58:37 crc kubenswrapper[4774]: I1001 13:58:37.270627 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:58:37 crc kubenswrapper[4774]: I1001 13:58:37.271335 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:58:54 crc kubenswrapper[4774]: I1001 13:58:54.972871 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:54 crc kubenswrapper[4774]: I1001 13:58:54.986263 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:56 crc kubenswrapper[4774]: I1001 13:58:56.162783 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-295gw"] Oct 01 13:58:56 crc kubenswrapper[4774]: I1001 13:58:56.162982 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" podUID="b32a5973-c2a5-48de-b58a-4072ef19735a" containerName="keystone-api" 
containerID="cri-o://422c73ad0f982c2f902b8cf5eb2e8cc85613f8939880ea1d4b11ee04b64d9760" gracePeriod=30 Oct 01 13:58:56 crc kubenswrapper[4774]: I1001 13:58:56.172510 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6"] Oct 01 13:58:56 crc kubenswrapper[4774]: I1001 13:58:56.172748 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" podUID="0531feae-96b1-4efc-975d-efaa5dddcbd0" containerName="keystone-api" containerID="cri-o://f041d95bdc63a42ef9c89dde4ad68b60114d6ab2b7efca9f9a47875517dd660c" gracePeriod=30 Oct 01 13:58:57 crc kubenswrapper[4774]: I1001 13:58:57.364768 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-9c46g"] Oct 01 13:58:57 crc kubenswrapper[4774]: I1001 13:58:57.365093 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" podUID="d69b0665-c419-4457-9636-da3472bc413a" containerName="keystone-api" containerID="cri-o://040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557" gracePeriod=30 Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.513622 4774 generic.go:334] "Generic (PLEG): container finished" podID="0531feae-96b1-4efc-975d-efaa5dddcbd0" containerID="f041d95bdc63a42ef9c89dde4ad68b60114d6ab2b7efca9f9a47875517dd660c" exitCode=0 Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.513726 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" event={"ID":"0531feae-96b1-4efc-975d-efaa5dddcbd0","Type":"ContainerDied","Data":"f041d95bdc63a42ef9c89dde4ad68b60114d6ab2b7efca9f9a47875517dd660c"} Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.518059 4774 generic.go:334] "Generic (PLEG): container finished" podID="b32a5973-c2a5-48de-b58a-4072ef19735a" 
containerID="422c73ad0f982c2f902b8cf5eb2e8cc85613f8939880ea1d4b11ee04b64d9760" exitCode=0 Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.518100 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" event={"ID":"b32a5973-c2a5-48de-b58a-4072ef19735a","Type":"ContainerDied","Data":"422c73ad0f982c2f902b8cf5eb2e8cc85613f8939880ea1d4b11ee04b64d9760"} Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.722969 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.731660 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.872973 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-config-data\") pod \"0531feae-96b1-4efc-975d-efaa5dddcbd0\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873057 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-fernet-keys\") pod \"b32a5973-c2a5-48de-b58a-4072ef19735a\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873085 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbkgs\" (UniqueName: \"kubernetes.io/projected/0531feae-96b1-4efc-975d-efaa5dddcbd0-kube-api-access-rbkgs\") pod \"0531feae-96b1-4efc-975d-efaa5dddcbd0\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873130 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-scripts\") pod \"0531feae-96b1-4efc-975d-efaa5dddcbd0\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873164 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-fernet-keys\") pod \"0531feae-96b1-4efc-975d-efaa5dddcbd0\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873251 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-config-data\") pod \"b32a5973-c2a5-48de-b58a-4072ef19735a\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873864 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xb5b\" (UniqueName: \"kubernetes.io/projected/b32a5973-c2a5-48de-b58a-4072ef19735a-kube-api-access-9xb5b\") pod \"b32a5973-c2a5-48de-b58a-4072ef19735a\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873900 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-credential-keys\") pod \"b32a5973-c2a5-48de-b58a-4072ef19735a\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.873928 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-credential-keys\") pod \"0531feae-96b1-4efc-975d-efaa5dddcbd0\" (UID: \"0531feae-96b1-4efc-975d-efaa5dddcbd0\") " Oct 01 13:58:59 crc 
kubenswrapper[4774]: I1001 13:58:59.873959 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-scripts\") pod \"b32a5973-c2a5-48de-b58a-4072ef19735a\" (UID: \"b32a5973-c2a5-48de-b58a-4072ef19735a\") " Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.878725 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b32a5973-c2a5-48de-b58a-4072ef19735a" (UID: "b32a5973-c2a5-48de-b58a-4072ef19735a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.878761 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-scripts" (OuterVolumeSpecName: "scripts") pod "b32a5973-c2a5-48de-b58a-4072ef19735a" (UID: "b32a5973-c2a5-48de-b58a-4072ef19735a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.879892 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b32a5973-c2a5-48de-b58a-4072ef19735a-kube-api-access-9xb5b" (OuterVolumeSpecName: "kube-api-access-9xb5b") pod "b32a5973-c2a5-48de-b58a-4072ef19735a" (UID: "b32a5973-c2a5-48de-b58a-4072ef19735a"). InnerVolumeSpecName "kube-api-access-9xb5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.879941 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0531feae-96b1-4efc-975d-efaa5dddcbd0-kube-api-access-rbkgs" (OuterVolumeSpecName: "kube-api-access-rbkgs") pod "0531feae-96b1-4efc-975d-efaa5dddcbd0" (UID: "0531feae-96b1-4efc-975d-efaa5dddcbd0"). 
InnerVolumeSpecName "kube-api-access-rbkgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.879962 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0531feae-96b1-4efc-975d-efaa5dddcbd0" (UID: "0531feae-96b1-4efc-975d-efaa5dddcbd0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.880585 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-scripts" (OuterVolumeSpecName: "scripts") pod "0531feae-96b1-4efc-975d-efaa5dddcbd0" (UID: "0531feae-96b1-4efc-975d-efaa5dddcbd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.880757 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0531feae-96b1-4efc-975d-efaa5dddcbd0" (UID: "0531feae-96b1-4efc-975d-efaa5dddcbd0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.888094 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b32a5973-c2a5-48de-b58a-4072ef19735a" (UID: "b32a5973-c2a5-48de-b58a-4072ef19735a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.894968 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-config-data" (OuterVolumeSpecName: "config-data") pod "0531feae-96b1-4efc-975d-efaa5dddcbd0" (UID: "0531feae-96b1-4efc-975d-efaa5dddcbd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.900667 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-config-data" (OuterVolumeSpecName: "config-data") pod "b32a5973-c2a5-48de-b58a-4072ef19735a" (UID: "b32a5973-c2a5-48de-b58a-4072ef19735a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975606 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975661 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975685 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975702 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 
13:58:59.975722 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xb5b\" (UniqueName: \"kubernetes.io/projected/b32a5973-c2a5-48de-b58a-4072ef19735a-kube-api-access-9xb5b\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975739 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975777 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975802 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0531feae-96b1-4efc-975d-efaa5dddcbd0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975824 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b32a5973-c2a5-48de-b58a-4072ef19735a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:58:59 crc kubenswrapper[4774]: I1001 13:58:59.975850 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbkgs\" (UniqueName: \"kubernetes.io/projected/0531feae-96b1-4efc-975d-efaa5dddcbd0-kube-api-access-rbkgs\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.525857 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" event={"ID":"0531feae-96b1-4efc-975d-efaa5dddcbd0","Type":"ContainerDied","Data":"25a88e3efc123a2ea79dd70f776dba0042de0504b8857577941e9bd7301788da"} Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.525904 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.525929 4774 scope.go:117] "RemoveContainer" containerID="f041d95bdc63a42ef9c89dde4ad68b60114d6ab2b7efca9f9a47875517dd660c" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.526963 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" event={"ID":"b32a5973-c2a5-48de-b58a-4072ef19735a","Type":"ContainerDied","Data":"ff72b1f99edec4206c88532f3e85874e79dce1e6a4f944d452f43f3ec21d9cfb"} Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.527034 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-295gw" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.551414 4774 scope.go:117] "RemoveContainer" containerID="422c73ad0f982c2f902b8cf5eb2e8cc85613f8939880ea1d4b11ee04b64d9760" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.641746 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6"] Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.655521 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-q5sc6"] Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.671568 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-295gw"] Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.678020 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-295gw"] Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.879500 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0531feae-96b1-4efc-975d-efaa5dddcbd0" path="/var/lib/kubelet/pods/0531feae-96b1-4efc-975d-efaa5dddcbd0/volumes" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.880132 4774 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b32a5973-c2a5-48de-b58a-4072ef19735a" path="/var/lib/kubelet/pods/b32a5973-c2a5-48de-b58a-4072ef19735a/volumes" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.914541 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.990992 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln2xd\" (UniqueName: \"kubernetes.io/projected/d69b0665-c419-4457-9636-da3472bc413a-kube-api-access-ln2xd\") pod \"d69b0665-c419-4457-9636-da3472bc413a\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.991045 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-fernet-keys\") pod \"d69b0665-c419-4457-9636-da3472bc413a\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.991131 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-config-data\") pod \"d69b0665-c419-4457-9636-da3472bc413a\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.991182 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-scripts\") pod \"d69b0665-c419-4457-9636-da3472bc413a\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.991225 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-credential-keys\") 
pod \"d69b0665-c419-4457-9636-da3472bc413a\" (UID: \"d69b0665-c419-4457-9636-da3472bc413a\") " Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.997137 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-scripts" (OuterVolumeSpecName: "scripts") pod "d69b0665-c419-4457-9636-da3472bc413a" (UID: "d69b0665-c419-4457-9636-da3472bc413a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.997295 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69b0665-c419-4457-9636-da3472bc413a-kube-api-access-ln2xd" (OuterVolumeSpecName: "kube-api-access-ln2xd") pod "d69b0665-c419-4457-9636-da3472bc413a" (UID: "d69b0665-c419-4457-9636-da3472bc413a"). InnerVolumeSpecName "kube-api-access-ln2xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.999385 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d69b0665-c419-4457-9636-da3472bc413a" (UID: "d69b0665-c419-4457-9636-da3472bc413a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:00 crc kubenswrapper[4774]: I1001 13:59:00.999663 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d69b0665-c419-4457-9636-da3472bc413a" (UID: "d69b0665-c419-4457-9636-da3472bc413a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.023400 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-config-data" (OuterVolumeSpecName: "config-data") pod "d69b0665-c419-4457-9636-da3472bc413a" (UID: "d69b0665-c419-4457-9636-da3472bc413a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.093013 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.093063 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.093086 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln2xd\" (UniqueName: \"kubernetes.io/projected/d69b0665-c419-4457-9636-da3472bc413a-kube-api-access-ln2xd\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.093103 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.093123 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d69b0665-c419-4457-9636-da3472bc413a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.539398 4774 generic.go:334] "Generic (PLEG): container finished" podID="d69b0665-c419-4457-9636-da3472bc413a" 
containerID="040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557" exitCode=0 Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.539528 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.539554 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" event={"ID":"d69b0665-c419-4457-9636-da3472bc413a","Type":"ContainerDied","Data":"040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557"} Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.539632 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-765bbc8bd7-9c46g" event={"ID":"d69b0665-c419-4457-9636-da3472bc413a","Type":"ContainerDied","Data":"30cf65d95d4e4d0a43ba2faede9cf82bee2d2e6e5e3f7150f88c22fca9ef3714"} Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.539664 4774 scope.go:117] "RemoveContainer" containerID="040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.580023 4774 scope.go:117] "RemoveContainer" containerID="040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557" Oct 01 13:59:01 crc kubenswrapper[4774]: E1001 13:59:01.580791 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557\": container with ID starting with 040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557 not found: ID does not exist" containerID="040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.581133 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557"} err="failed to 
get container status \"040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557\": rpc error: code = NotFound desc = could not find container \"040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557\": container with ID starting with 040b041e643c3d3b3680b398c769ba7608b91b3184817c5fff6e489ac9e6b557 not found: ID does not exist" Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.602034 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-9c46g"] Oct 01 13:59:01 crc kubenswrapper[4774]: I1001 13:59:01.611004 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-765bbc8bd7-9c46g"] Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.567553 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r7jv9"] Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.580101 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-r7jv9"] Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.585469 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-44vjk"] Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.592154 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-44vjk"] Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617133 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk"] Oct 01 13:59:02 crc kubenswrapper[4774]: E1001 13:59:02.617497 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d69b0665-c419-4457-9636-da3472bc413a" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617525 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69b0665-c419-4457-9636-da3472bc413a" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: E1001 
13:59:02.617559 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0531feae-96b1-4efc-975d-efaa5dddcbd0" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617571 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="0531feae-96b1-4efc-975d-efaa5dddcbd0" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: E1001 13:59:02.617598 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b32a5973-c2a5-48de-b58a-4072ef19735a" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617608 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b32a5973-c2a5-48de-b58a-4072ef19735a" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617784 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d69b0665-c419-4457-9636-da3472bc413a" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617803 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="0531feae-96b1-4efc-975d-efaa5dddcbd0" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.617819 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b32a5973-c2a5-48de-b58a-4072ef19735a" containerName="keystone-api" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.618493 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.625696 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk"] Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.718986 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6tt\" (UniqueName: \"kubernetes.io/projected/22ae8a0f-9047-4e3b-bc9f-c023349ea08c-kube-api-access-2m6tt\") pod \"keystonef1e9-account-delete-9tzmk\" (UID: \"22ae8a0f-9047-4e3b-bc9f-c023349ea08c\") " pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.820914 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6tt\" (UniqueName: \"kubernetes.io/projected/22ae8a0f-9047-4e3b-bc9f-c023349ea08c-kube-api-access-2m6tt\") pod \"keystonef1e9-account-delete-9tzmk\" (UID: \"22ae8a0f-9047-4e3b-bc9f-c023349ea08c\") " pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.840872 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6tt\" (UniqueName: \"kubernetes.io/projected/22ae8a0f-9047-4e3b-bc9f-c023349ea08c-kube-api-access-2m6tt\") pod \"keystonef1e9-account-delete-9tzmk\" (UID: \"22ae8a0f-9047-4e3b-bc9f-c023349ea08c\") " pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.881589 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d103ac-9295-4d73-82f5-6023b7b28c99" path="/var/lib/kubelet/pods/93d103ac-9295-4d73-82f5-6023b7b28c99/volumes" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.882932 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69b0665-c419-4457-9636-da3472bc413a" 
path="/var/lib/kubelet/pods/d69b0665-c419-4457-9636-da3472bc413a/volumes" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.884284 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd262894-eab3-432c-bd57-8b43950fccab" path="/var/lib/kubelet/pods/fd262894-eab3-432c-bd57-8b43950fccab/volumes" Oct 01 13:59:02 crc kubenswrapper[4774]: I1001 13:59:02.936412 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:03 crc kubenswrapper[4774]: I1001 13:59:03.199575 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk"] Oct 01 13:59:03 crc kubenswrapper[4774]: I1001 13:59:03.572625 4774 generic.go:334] "Generic (PLEG): container finished" podID="22ae8a0f-9047-4e3b-bc9f-c023349ea08c" containerID="98f1d0e8be02556c92f44319b70c5e887e8674971422459c461b2fa18abcd67a" exitCode=0 Oct 01 13:59:03 crc kubenswrapper[4774]: I1001 13:59:03.572657 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" event={"ID":"22ae8a0f-9047-4e3b-bc9f-c023349ea08c","Type":"ContainerDied","Data":"98f1d0e8be02556c92f44319b70c5e887e8674971422459c461b2fa18abcd67a"} Oct 01 13:59:03 crc kubenswrapper[4774]: I1001 13:59:03.572681 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" event={"ID":"22ae8a0f-9047-4e3b-bc9f-c023349ea08c","Type":"ContainerStarted","Data":"a46cc0907a69a29d5668de3302b07f7a98eea11e115b5c49faeaa3f4a0519501"} Oct 01 13:59:04 crc kubenswrapper[4774]: I1001 13:59:04.894477 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:04 crc kubenswrapper[4774]: I1001 13:59:04.965490 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m6tt\" (UniqueName: \"kubernetes.io/projected/22ae8a0f-9047-4e3b-bc9f-c023349ea08c-kube-api-access-2m6tt\") pod \"22ae8a0f-9047-4e3b-bc9f-c023349ea08c\" (UID: \"22ae8a0f-9047-4e3b-bc9f-c023349ea08c\") " Oct 01 13:59:04 crc kubenswrapper[4774]: I1001 13:59:04.973726 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ae8a0f-9047-4e3b-bc9f-c023349ea08c-kube-api-access-2m6tt" (OuterVolumeSpecName: "kube-api-access-2m6tt") pod "22ae8a0f-9047-4e3b-bc9f-c023349ea08c" (UID: "22ae8a0f-9047-4e3b-bc9f-c023349ea08c"). InnerVolumeSpecName "kube-api-access-2m6tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:59:05 crc kubenswrapper[4774]: I1001 13:59:05.067572 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m6tt\" (UniqueName: \"kubernetes.io/projected/22ae8a0f-9047-4e3b-bc9f-c023349ea08c-kube-api-access-2m6tt\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:05 crc kubenswrapper[4774]: I1001 13:59:05.590217 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" event={"ID":"22ae8a0f-9047-4e3b-bc9f-c023349ea08c","Type":"ContainerDied","Data":"a46cc0907a69a29d5668de3302b07f7a98eea11e115b5c49faeaa3f4a0519501"} Oct 01 13:59:05 crc kubenswrapper[4774]: I1001 13:59:05.590633 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46cc0907a69a29d5668de3302b07f7a98eea11e115b5c49faeaa3f4a0519501" Oct 01 13:59:05 crc kubenswrapper[4774]: I1001 13:59:05.590293 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.271765 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.271863 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.271931 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.272800 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c15aef90c5355ee45eff7a2029dad852c4a145de45cfd5eb39c4d4e24c84668f"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.272884 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://c15aef90c5355ee45eff7a2029dad852c4a145de45cfd5eb39c4d4e24c84668f" gracePeriod=600 Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.611202 4774 generic.go:334] "Generic (PLEG): container finished" 
podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="c15aef90c5355ee45eff7a2029dad852c4a145de45cfd5eb39c4d4e24c84668f" exitCode=0 Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.611414 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"c15aef90c5355ee45eff7a2029dad852c4a145de45cfd5eb39c4d4e24c84668f"} Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.611887 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"} Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.611936 4774 scope.go:117] "RemoveContainer" containerID="4e1986b6aa4082eaba4db55db350683c8c3d94491cc5ccf5f341ed6826a24126" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.641270 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vqzcm"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.646264 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-vqzcm"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.657331 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.661579 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.664832 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonef1e9-account-delete-9tzmk"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.668529 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["keystone-kuttl-tests/keystone-f1e9-account-create-4dg7n"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.775244 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cp2bd"] Oct 01 13:59:07 crc kubenswrapper[4774]: E1001 13:59:07.775528 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ae8a0f-9047-4e3b-bc9f-c023349ea08c" containerName="mariadb-account-delete" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.775544 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ae8a0f-9047-4e3b-bc9f-c023349ea08c" containerName="mariadb-account-delete" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.775682 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ae8a0f-9047-4e3b-bc9f-c023349ea08c" containerName="mariadb-account-delete" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.776135 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.787796 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cp2bd"] Oct 01 13:59:07 crc kubenswrapper[4774]: I1001 13:59:07.909544 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd4dt\" (UniqueName: \"kubernetes.io/projected/31e9c8de-7c38-46cc-93e4-7dd8008eaed7-kube-api-access-fd4dt\") pod \"keystone-db-create-cp2bd\" (UID: \"31e9c8de-7c38-46cc-93e4-7dd8008eaed7\") " pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.010587 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd4dt\" (UniqueName: \"kubernetes.io/projected/31e9c8de-7c38-46cc-93e4-7dd8008eaed7-kube-api-access-fd4dt\") pod \"keystone-db-create-cp2bd\" (UID: \"31e9c8de-7c38-46cc-93e4-7dd8008eaed7\") " 
pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.033511 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd4dt\" (UniqueName: \"kubernetes.io/projected/31e9c8de-7c38-46cc-93e4-7dd8008eaed7-kube-api-access-fd4dt\") pod \"keystone-db-create-cp2bd\" (UID: \"31e9c8de-7c38-46cc-93e4-7dd8008eaed7\") " pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.092750 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.613368 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cp2bd"] Oct 01 13:59:08 crc kubenswrapper[4774]: W1001 13:59:08.624316 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e9c8de_7c38_46cc_93e4_7dd8008eaed7.slice/crio-c00a9ae21d27fc6ce694bc12e955e385d2f8210606c7a284c7b8ef29f9af179c WatchSource:0}: Error finding container c00a9ae21d27fc6ce694bc12e955e385d2f8210606c7a284c7b8ef29f9af179c: Status 404 returned error can't find the container with id c00a9ae21d27fc6ce694bc12e955e385d2f8210606c7a284c7b8ef29f9af179c Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.888329 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ae8a0f-9047-4e3b-bc9f-c023349ea08c" path="/var/lib/kubelet/pods/22ae8a0f-9047-4e3b-bc9f-c023349ea08c/volumes" Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.889246 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564820e0-1c91-4d46-9a71-a9cf7f4b68f6" path="/var/lib/kubelet/pods/564820e0-1c91-4d46-9a71-a9cf7f4b68f6/volumes" Oct 01 13:59:08 crc kubenswrapper[4774]: I1001 13:59:08.890332 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf64d03f-0ff8-4021-9979-07d407624518" path="/var/lib/kubelet/pods/bf64d03f-0ff8-4021-9979-07d407624518/volumes" Oct 01 13:59:09 crc kubenswrapper[4774]: I1001 13:59:09.482247 4774 scope.go:117] "RemoveContainer" containerID="f0bc8b6b98a64d3569d6745238a4f8d98bd01a2a024983efc9d4f996fdef984a" Oct 01 13:59:09 crc kubenswrapper[4774]: I1001 13:59:09.501884 4774 scope.go:117] "RemoveContainer" containerID="89eeddf8d71cfe8f2e08615aadc8739b2680a4a2277361a81ebf51dd7ac025f8" Oct 01 13:59:09 crc kubenswrapper[4774]: I1001 13:59:09.627999 4774 generic.go:334] "Generic (PLEG): container finished" podID="31e9c8de-7c38-46cc-93e4-7dd8008eaed7" containerID="7ab8ec0a4a3e190d237dd706662b9d147cbfafece8b6790f9c7768ad527bebcd" exitCode=0 Oct 01 13:59:09 crc kubenswrapper[4774]: I1001 13:59:09.628038 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" event={"ID":"31e9c8de-7c38-46cc-93e4-7dd8008eaed7","Type":"ContainerDied","Data":"7ab8ec0a4a3e190d237dd706662b9d147cbfafece8b6790f9c7768ad527bebcd"} Oct 01 13:59:09 crc kubenswrapper[4774]: I1001 13:59:09.628063 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" event={"ID":"31e9c8de-7c38-46cc-93e4-7dd8008eaed7","Type":"ContainerStarted","Data":"c00a9ae21d27fc6ce694bc12e955e385d2f8210606c7a284c7b8ef29f9af179c"} Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.006483 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.053169 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd4dt\" (UniqueName: \"kubernetes.io/projected/31e9c8de-7c38-46cc-93e4-7dd8008eaed7-kube-api-access-fd4dt\") pod \"31e9c8de-7c38-46cc-93e4-7dd8008eaed7\" (UID: \"31e9c8de-7c38-46cc-93e4-7dd8008eaed7\") " Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.084365 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e9c8de-7c38-46cc-93e4-7dd8008eaed7-kube-api-access-fd4dt" (OuterVolumeSpecName: "kube-api-access-fd4dt") pod "31e9c8de-7c38-46cc-93e4-7dd8008eaed7" (UID: "31e9c8de-7c38-46cc-93e4-7dd8008eaed7"). InnerVolumeSpecName "kube-api-access-fd4dt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.154943 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd4dt\" (UniqueName: \"kubernetes.io/projected/31e9c8de-7c38-46cc-93e4-7dd8008eaed7-kube-api-access-fd4dt\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.664216 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" event={"ID":"31e9c8de-7c38-46cc-93e4-7dd8008eaed7","Type":"ContainerDied","Data":"c00a9ae21d27fc6ce694bc12e955e385d2f8210606c7a284c7b8ef29f9af179c"} Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.664614 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00a9ae21d27fc6ce694bc12e955e385d2f8210606c7a284c7b8ef29f9af179c" Oct 01 13:59:11 crc kubenswrapper[4774]: I1001 13:59:11.664813 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-create-cp2bd" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.815874 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j"] Oct 01 13:59:17 crc kubenswrapper[4774]: E1001 13:59:17.816998 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e9c8de-7c38-46cc-93e4-7dd8008eaed7" containerName="mariadb-database-create" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.817025 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e9c8de-7c38-46cc-93e4-7dd8008eaed7" containerName="mariadb-database-create" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.817231 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e9c8de-7c38-46cc-93e4-7dd8008eaed7" containerName="mariadb-database-create" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.817921 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.819967 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-db-secret" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.832780 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j"] Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.855651 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls4gn\" (UniqueName: \"kubernetes.io/projected/82a15ed2-6de1-46b5-9827-cdc086c1286f-kube-api-access-ls4gn\") pod \"keystone-cbe7-account-create-rkg6j\" (UID: \"82a15ed2-6de1-46b5-9827-cdc086c1286f\") " pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.957848 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ls4gn\" (UniqueName: \"kubernetes.io/projected/82a15ed2-6de1-46b5-9827-cdc086c1286f-kube-api-access-ls4gn\") pod \"keystone-cbe7-account-create-rkg6j\" (UID: \"82a15ed2-6de1-46b5-9827-cdc086c1286f\") " pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:17 crc kubenswrapper[4774]: I1001 13:59:17.989939 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls4gn\" (UniqueName: \"kubernetes.io/projected/82a15ed2-6de1-46b5-9827-cdc086c1286f-kube-api-access-ls4gn\") pod \"keystone-cbe7-account-create-rkg6j\" (UID: \"82a15ed2-6de1-46b5-9827-cdc086c1286f\") " pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:18 crc kubenswrapper[4774]: I1001 13:59:18.151586 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:18 crc kubenswrapper[4774]: I1001 13:59:18.697597 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j"] Oct 01 13:59:18 crc kubenswrapper[4774]: I1001 13:59:18.728008 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" event={"ID":"82a15ed2-6de1-46b5-9827-cdc086c1286f","Type":"ContainerStarted","Data":"9bb54ada682005454d9edbe1af9748f13d1272d8420560a70cfd48715304001c"} Oct 01 13:59:19 crc kubenswrapper[4774]: I1001 13:59:19.738775 4774 generic.go:334] "Generic (PLEG): container finished" podID="82a15ed2-6de1-46b5-9827-cdc086c1286f" containerID="bfe56871649e1fe80d595d13c1de8aea42efc1993a9d539fecbe2dcc8165493e" exitCode=0 Oct 01 13:59:19 crc kubenswrapper[4774]: I1001 13:59:19.738867 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" 
event={"ID":"82a15ed2-6de1-46b5-9827-cdc086c1286f","Type":"ContainerDied","Data":"bfe56871649e1fe80d595d13c1de8aea42efc1993a9d539fecbe2dcc8165493e"} Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.132108 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.212286 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls4gn\" (UniqueName: \"kubernetes.io/projected/82a15ed2-6de1-46b5-9827-cdc086c1286f-kube-api-access-ls4gn\") pod \"82a15ed2-6de1-46b5-9827-cdc086c1286f\" (UID: \"82a15ed2-6de1-46b5-9827-cdc086c1286f\") " Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.221409 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a15ed2-6de1-46b5-9827-cdc086c1286f-kube-api-access-ls4gn" (OuterVolumeSpecName: "kube-api-access-ls4gn") pod "82a15ed2-6de1-46b5-9827-cdc086c1286f" (UID: "82a15ed2-6de1-46b5-9827-cdc086c1286f"). InnerVolumeSpecName "kube-api-access-ls4gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.313591 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls4gn\" (UniqueName: \"kubernetes.io/projected/82a15ed2-6de1-46b5-9827-cdc086c1286f-kube-api-access-ls4gn\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.758530 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" event={"ID":"82a15ed2-6de1-46b5-9827-cdc086c1286f","Type":"ContainerDied","Data":"9bb54ada682005454d9edbe1af9748f13d1272d8420560a70cfd48715304001c"} Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.758586 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb54ada682005454d9edbe1af9748f13d1272d8420560a70cfd48715304001c" Oct 01 13:59:21 crc kubenswrapper[4774]: I1001 13:59:21.758631 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.319022 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-nsvjs"] Oct 01 13:59:23 crc kubenswrapper[4774]: E1001 13:59:23.319841 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a15ed2-6de1-46b5-9827-cdc086c1286f" containerName="mariadb-account-create" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.319869 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a15ed2-6de1-46b5-9827-cdc086c1286f" containerName="mariadb-account-create" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.320114 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a15ed2-6de1-46b5-9827-cdc086c1286f" containerName="mariadb-account-create" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.320777 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.325354 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.325505 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.325533 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.326377 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-vtrlv" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.338223 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-nsvjs"] Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.347261 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c06ab62-1df7-4af5-9ebd-6e74545adff8-config-data\") pod \"keystone-db-sync-nsvjs\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.347380 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48fq6\" (UniqueName: \"kubernetes.io/projected/2c06ab62-1df7-4af5-9ebd-6e74545adff8-kube-api-access-48fq6\") pod \"keystone-db-sync-nsvjs\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.448388 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2c06ab62-1df7-4af5-9ebd-6e74545adff8-config-data\") pod \"keystone-db-sync-nsvjs\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.448548 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48fq6\" (UniqueName: \"kubernetes.io/projected/2c06ab62-1df7-4af5-9ebd-6e74545adff8-kube-api-access-48fq6\") pod \"keystone-db-sync-nsvjs\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.458065 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c06ab62-1df7-4af5-9ebd-6e74545adff8-config-data\") pod \"keystone-db-sync-nsvjs\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.478210 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48fq6\" (UniqueName: \"kubernetes.io/projected/2c06ab62-1df7-4af5-9ebd-6e74545adff8-kube-api-access-48fq6\") pod \"keystone-db-sync-nsvjs\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:23 crc kubenswrapper[4774]: I1001 13:59:23.647293 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:24 crc kubenswrapper[4774]: I1001 13:59:24.164389 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-nsvjs"] Oct 01 13:59:24 crc kubenswrapper[4774]: W1001 13:59:24.175169 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c06ab62_1df7_4af5_9ebd_6e74545adff8.slice/crio-83c6b0ece71ea9216e494980bc787fa86bf93119dc9a02af0163952f3e253813 WatchSource:0}: Error finding container 83c6b0ece71ea9216e494980bc787fa86bf93119dc9a02af0163952f3e253813: Status 404 returned error can't find the container with id 83c6b0ece71ea9216e494980bc787fa86bf93119dc9a02af0163952f3e253813 Oct 01 13:59:24 crc kubenswrapper[4774]: I1001 13:59:24.787416 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" event={"ID":"2c06ab62-1df7-4af5-9ebd-6e74545adff8","Type":"ContainerStarted","Data":"ad5a25121cb41282a5ee5c003cdb8b6486fa6ef864d9e86a35db1b7e11cd6b87"} Oct 01 13:59:24 crc kubenswrapper[4774]: I1001 13:59:24.788032 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" event={"ID":"2c06ab62-1df7-4af5-9ebd-6e74545adff8","Type":"ContainerStarted","Data":"83c6b0ece71ea9216e494980bc787fa86bf93119dc9a02af0163952f3e253813"} Oct 01 13:59:24 crc kubenswrapper[4774]: I1001 13:59:24.815128 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" podStartSLOduration=1.8151074889999999 podStartE2EDuration="1.815107489s" podCreationTimestamp="2025-10-01 13:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:59:24.808809931 +0000 UTC m=+1336.698440548" watchObservedRunningTime="2025-10-01 13:59:24.815107489 +0000 UTC 
m=+1336.704738096" Oct 01 13:59:25 crc kubenswrapper[4774]: I1001 13:59:25.798164 4774 generic.go:334] "Generic (PLEG): container finished" podID="2c06ab62-1df7-4af5-9ebd-6e74545adff8" containerID="ad5a25121cb41282a5ee5c003cdb8b6486fa6ef864d9e86a35db1b7e11cd6b87" exitCode=0 Oct 01 13:59:25 crc kubenswrapper[4774]: I1001 13:59:25.798230 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" event={"ID":"2c06ab62-1df7-4af5-9ebd-6e74545adff8","Type":"ContainerDied","Data":"ad5a25121cb41282a5ee5c003cdb8b6486fa6ef864d9e86a35db1b7e11cd6b87"} Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.134009 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.313831 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c06ab62-1df7-4af5-9ebd-6e74545adff8-config-data\") pod \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.313999 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48fq6\" (UniqueName: \"kubernetes.io/projected/2c06ab62-1df7-4af5-9ebd-6e74545adff8-kube-api-access-48fq6\") pod \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\" (UID: \"2c06ab62-1df7-4af5-9ebd-6e74545adff8\") " Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.319836 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c06ab62-1df7-4af5-9ebd-6e74545adff8-kube-api-access-48fq6" (OuterVolumeSpecName: "kube-api-access-48fq6") pod "2c06ab62-1df7-4af5-9ebd-6e74545adff8" (UID: "2c06ab62-1df7-4af5-9ebd-6e74545adff8"). InnerVolumeSpecName "kube-api-access-48fq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.356631 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c06ab62-1df7-4af5-9ebd-6e74545adff8-config-data" (OuterVolumeSpecName: "config-data") pod "2c06ab62-1df7-4af5-9ebd-6e74545adff8" (UID: "2c06ab62-1df7-4af5-9ebd-6e74545adff8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.416338 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48fq6\" (UniqueName: \"kubernetes.io/projected/2c06ab62-1df7-4af5-9ebd-6e74545adff8-kube-api-access-48fq6\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.416658 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c06ab62-1df7-4af5-9ebd-6e74545adff8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.820271 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" event={"ID":"2c06ab62-1df7-4af5-9ebd-6e74545adff8","Type":"ContainerDied","Data":"83c6b0ece71ea9216e494980bc787fa86bf93119dc9a02af0163952f3e253813"} Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.820744 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c6b0ece71ea9216e494980bc787fa86bf93119dc9a02af0163952f3e253813" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.820369 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-db-sync-nsvjs" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.994127 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tfjw7"] Oct 01 13:59:27 crc kubenswrapper[4774]: E1001 13:59:27.994404 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c06ab62-1df7-4af5-9ebd-6e74545adff8" containerName="keystone-db-sync" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.994426 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c06ab62-1df7-4af5-9ebd-6e74545adff8" containerName="keystone-db-sync" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.994590 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c06ab62-1df7-4af5-9ebd-6e74545adff8" containerName="keystone-db-sync" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.995081 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.996848 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.997100 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.998338 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:59:27 crc kubenswrapper[4774]: I1001 13:59:27.998890 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-vtrlv" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.014271 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tfjw7"] Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.125721 4774 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvs26\" (UniqueName: \"kubernetes.io/projected/d985f69d-6361-44f9-8565-8811bace1c81-kube-api-access-rvs26\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.125974 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-credential-keys\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.126032 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-scripts\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.126058 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-fernet-keys\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.126122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-config-data\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc 
kubenswrapper[4774]: I1001 13:59:28.227825 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-config-data\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.227965 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvs26\" (UniqueName: \"kubernetes.io/projected/d985f69d-6361-44f9-8565-8811bace1c81-kube-api-access-rvs26\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.228023 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-credential-keys\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.228098 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-scripts\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.228143 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-fernet-keys\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.233168 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-scripts\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.235103 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-fernet-keys\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.235255 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-config-data\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.237046 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-credential-keys\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.271733 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvs26\" (UniqueName: \"kubernetes.io/projected/d985f69d-6361-44f9-8565-8811bace1c81-kube-api-access-rvs26\") pod \"keystone-bootstrap-tfjw7\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.315445 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.602837 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tfjw7"] Oct 01 13:59:28 crc kubenswrapper[4774]: W1001 13:59:28.607072 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd985f69d_6361_44f9_8565_8811bace1c81.slice/crio-989ee90821d5a9726919accdab31edcdd9d35d01187050b76c6804acc55a8ee8 WatchSource:0}: Error finding container 989ee90821d5a9726919accdab31edcdd9d35d01187050b76c6804acc55a8ee8: Status 404 returned error can't find the container with id 989ee90821d5a9726919accdab31edcdd9d35d01187050b76c6804acc55a8ee8 Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.828543 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" event={"ID":"d985f69d-6361-44f9-8565-8811bace1c81","Type":"ContainerStarted","Data":"0a99b08a95c7f6ad3095ca63d6e1a4ddfe9c2184ce6772538491c2af3cede00d"} Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.828584 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" event={"ID":"d985f69d-6361-44f9-8565-8811bace1c81","Type":"ContainerStarted","Data":"989ee90821d5a9726919accdab31edcdd9d35d01187050b76c6804acc55a8ee8"} Oct 01 13:59:28 crc kubenswrapper[4774]: I1001 13:59:28.852923 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" podStartSLOduration=1.8529069969999998 podStartE2EDuration="1.852906997s" podCreationTimestamp="2025-10-01 13:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:59:28.85112663 +0000 UTC m=+1340.740757247" watchObservedRunningTime="2025-10-01 13:59:28.852906997 +0000 UTC 
m=+1340.742537584" Oct 01 13:59:31 crc kubenswrapper[4774]: I1001 13:59:31.860100 4774 generic.go:334] "Generic (PLEG): container finished" podID="d985f69d-6361-44f9-8565-8811bace1c81" containerID="0a99b08a95c7f6ad3095ca63d6e1a4ddfe9c2184ce6772538491c2af3cede00d" exitCode=0 Oct 01 13:59:31 crc kubenswrapper[4774]: I1001 13:59:31.860176 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" event={"ID":"d985f69d-6361-44f9-8565-8811bace1c81","Type":"ContainerDied","Data":"0a99b08a95c7f6ad3095ca63d6e1a4ddfe9c2184ce6772538491c2af3cede00d"} Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.215254 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.315660 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-scripts\") pod \"d985f69d-6361-44f9-8565-8811bace1c81\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.315812 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-config-data\") pod \"d985f69d-6361-44f9-8565-8811bace1c81\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.315881 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-fernet-keys\") pod \"d985f69d-6361-44f9-8565-8811bace1c81\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.315990 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-credential-keys\") pod \"d985f69d-6361-44f9-8565-8811bace1c81\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.316107 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvs26\" (UniqueName: \"kubernetes.io/projected/d985f69d-6361-44f9-8565-8811bace1c81-kube-api-access-rvs26\") pod \"d985f69d-6361-44f9-8565-8811bace1c81\" (UID: \"d985f69d-6361-44f9-8565-8811bace1c81\") " Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.322910 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-scripts" (OuterVolumeSpecName: "scripts") pod "d985f69d-6361-44f9-8565-8811bace1c81" (UID: "d985f69d-6361-44f9-8565-8811bace1c81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.322977 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d985f69d-6361-44f9-8565-8811bace1c81-kube-api-access-rvs26" (OuterVolumeSpecName: "kube-api-access-rvs26") pod "d985f69d-6361-44f9-8565-8811bace1c81" (UID: "d985f69d-6361-44f9-8565-8811bace1c81"). InnerVolumeSpecName "kube-api-access-rvs26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.323861 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d985f69d-6361-44f9-8565-8811bace1c81" (UID: "d985f69d-6361-44f9-8565-8811bace1c81"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.324636 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d985f69d-6361-44f9-8565-8811bace1c81" (UID: "d985f69d-6361-44f9-8565-8811bace1c81"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.338315 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-config-data" (OuterVolumeSpecName: "config-data") pod "d985f69d-6361-44f9-8565-8811bace1c81" (UID: "d985f69d-6361-44f9-8565-8811bace1c81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.417752 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvs26\" (UniqueName: \"kubernetes.io/projected/d985f69d-6361-44f9-8565-8811bace1c81-kube-api-access-rvs26\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.417808 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.417828 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.417848 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:33 crc 
kubenswrapper[4774]: I1001 13:59:33.417868 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d985f69d-6361-44f9-8565-8811bace1c81-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.880977 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" event={"ID":"d985f69d-6361-44f9-8565-8811bace1c81","Type":"ContainerDied","Data":"989ee90821d5a9726919accdab31edcdd9d35d01187050b76c6804acc55a8ee8"} Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.881036 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="989ee90821d5a9726919accdab31edcdd9d35d01187050b76c6804acc55a8ee8" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.881065 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-bootstrap-tfjw7" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.992786 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystone-f66dcddc7-c4gg8"] Oct 01 13:59:33 crc kubenswrapper[4774]: E1001 13:59:33.993093 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985f69d-6361-44f9-8565-8811bace1c81" containerName="keystone-bootstrap" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.993108 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985f69d-6361-44f9-8565-8811bace1c81" containerName="keystone-bootstrap" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.993249 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985f69d-6361-44f9-8565-8811bace1c81" containerName="keystone-bootstrap" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.993743 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.995306 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-config-data" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.997294 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.998490 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-scripts" Oct 01 13:59:33 crc kubenswrapper[4774]: I1001 13:59:33.999634 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"keystone-keystone-dockercfg-vtrlv" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.014898 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-f66dcddc7-c4gg8"] Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.130292 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-fernet-keys\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.130483 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-scripts\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.130577 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-config-data\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.130664 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s47zn\" (UniqueName: \"kubernetes.io/projected/305cbbd8-b936-418b-9528-00d81715f080-kube-api-access-s47zn\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.130770 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-credential-keys\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.231645 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-fernet-keys\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.231747 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-scripts\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.231805 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-config-data\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.231879 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s47zn\" (UniqueName: \"kubernetes.io/projected/305cbbd8-b936-418b-9528-00d81715f080-kube-api-access-s47zn\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.231945 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-credential-keys\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.238436 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-config-data\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.239929 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-scripts\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.242040 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-credential-keys\") 
pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.243296 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-fernet-keys\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.264211 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s47zn\" (UniqueName: \"kubernetes.io/projected/305cbbd8-b936-418b-9528-00d81715f080-kube-api-access-s47zn\") pod \"keystone-f66dcddc7-c4gg8\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.312189 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.523196 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystone-f66dcddc7-c4gg8"] Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.890053 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" event={"ID":"305cbbd8-b936-418b-9528-00d81715f080","Type":"ContainerStarted","Data":"42f6587bee42e963fe8c4a76b48e7d56ffb5525da05823dc61d6053869639214"} Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.890408 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" event={"ID":"305cbbd8-b936-418b-9528-00d81715f080","Type":"ContainerStarted","Data":"1389dbe3e66adfd95d0697ea0390aac6edb4e93b82bc4bfb3019df79330d6525"} Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.890442 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 13:59:34 crc kubenswrapper[4774]: I1001 13:59:34.918230 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" podStartSLOduration=1.918202724 podStartE2EDuration="1.918202724s" podCreationTimestamp="2025-10-01 13:59:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:59:34.913535019 +0000 UTC m=+1346.803165646" watchObservedRunningTime="2025-10-01 13:59:34.918202724 +0000 UTC m=+1346.807833361" Oct 01 13:59:36 crc kubenswrapper[4774]: I1001 13:59:36.919394 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56" exitCode=1 Oct 01 13:59:36 crc kubenswrapper[4774]: I1001 13:59:36.919504 4774 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56"} Oct 01 13:59:36 crc kubenswrapper[4774]: I1001 13:59:36.919955 4774 scope.go:117] "RemoveContainer" containerID="ab74f29ccfc972ea57e476df0c067adf30a5ef8c04bdea625d27d5a9a1162536" Oct 01 13:59:36 crc kubenswrapper[4774]: I1001 13:59:36.920792 4774 scope.go:117] "RemoveContainer" containerID="57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56" Oct 01 13:59:36 crc kubenswrapper[4774]: E1001 13:59:36.921148 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 13:59:38 crc kubenswrapper[4774]: I1001 13:59:38.290034 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:59:38 crc kubenswrapper[4774]: I1001 13:59:38.290082 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 13:59:38 crc kubenswrapper[4774]: I1001 13:59:38.290621 4774 scope.go:117] "RemoveContainer" containerID="57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56" Oct 01 13:59:38 crc kubenswrapper[4774]: E1001 13:59:38.290909 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager 
pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 13:59:51 crc kubenswrapper[4774]: I1001 13:59:51.871253 4774 scope.go:117] "RemoveContainer" containerID="57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56" Oct 01 13:59:51 crc kubenswrapper[4774]: E1001 13:59:51.872177 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.145486 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht"] Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.148755 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.151903 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.153078 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.155444 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht"] Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.332690 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427902e9-1bd7-4691-9b45-05094bffa63f-secret-volume\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.332826 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427902e9-1bd7-4691-9b45-05094bffa63f-config-volume\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.333023 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjqn\" (UniqueName: \"kubernetes.io/projected/427902e9-1bd7-4691-9b45-05094bffa63f-kube-api-access-sqjqn\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.434500 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjqn\" (UniqueName: \"kubernetes.io/projected/427902e9-1bd7-4691-9b45-05094bffa63f-kube-api-access-sqjqn\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.434632 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427902e9-1bd7-4691-9b45-05094bffa63f-secret-volume\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.434780 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427902e9-1bd7-4691-9b45-05094bffa63f-config-volume\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.436736 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427902e9-1bd7-4691-9b45-05094bffa63f-config-volume\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.447974 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/427902e9-1bd7-4691-9b45-05094bffa63f-secret-volume\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.490081 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjqn\" (UniqueName: \"kubernetes.io/projected/427902e9-1bd7-4691-9b45-05094bffa63f-kube-api-access-sqjqn\") pod \"collect-profiles-29322120-p55ht\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:00 crc kubenswrapper[4774]: I1001 14:00:00.775154 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:01 crc kubenswrapper[4774]: I1001 14:00:01.088274 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht"] Oct 01 14:00:01 crc kubenswrapper[4774]: I1001 14:00:01.132305 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" event={"ID":"427902e9-1bd7-4691-9b45-05094bffa63f","Type":"ContainerStarted","Data":"e0682cae71988748b0d3e88fe7f870b8b91aef68c28e8dd2869bc4a3f12e5b08"} Oct 01 14:00:02 crc kubenswrapper[4774]: I1001 14:00:02.144572 4774 generic.go:334] "Generic (PLEG): container finished" podID="427902e9-1bd7-4691-9b45-05094bffa63f" containerID="b244177c171942deaffbcbdeb8225205149dc1b61f95a1939bec6e979b1db5f2" exitCode=0 Oct 01 14:00:02 crc kubenswrapper[4774]: I1001 14:00:02.144667 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" 
event={"ID":"427902e9-1bd7-4691-9b45-05094bffa63f","Type":"ContainerDied","Data":"b244177c171942deaffbcbdeb8225205149dc1b61f95a1939bec6e979b1db5f2"} Oct 01 14:00:02 crc kubenswrapper[4774]: I1001 14:00:02.871249 4774 scope.go:117] "RemoveContainer" containerID="57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.159699 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"} Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.160076 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.467966 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.583790 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427902e9-1bd7-4691-9b45-05094bffa63f-secret-volume\") pod \"427902e9-1bd7-4691-9b45-05094bffa63f\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.583938 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjqn\" (UniqueName: \"kubernetes.io/projected/427902e9-1bd7-4691-9b45-05094bffa63f-kube-api-access-sqjqn\") pod \"427902e9-1bd7-4691-9b45-05094bffa63f\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.583988 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427902e9-1bd7-4691-9b45-05094bffa63f-config-volume\") pod \"427902e9-1bd7-4691-9b45-05094bffa63f\" (UID: \"427902e9-1bd7-4691-9b45-05094bffa63f\") " Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.584729 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427902e9-1bd7-4691-9b45-05094bffa63f-config-volume" (OuterVolumeSpecName: "config-volume") pod "427902e9-1bd7-4691-9b45-05094bffa63f" (UID: "427902e9-1bd7-4691-9b45-05094bffa63f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.589575 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427902e9-1bd7-4691-9b45-05094bffa63f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "427902e9-1bd7-4691-9b45-05094bffa63f" (UID: "427902e9-1bd7-4691-9b45-05094bffa63f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.591134 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427902e9-1bd7-4691-9b45-05094bffa63f-kube-api-access-sqjqn" (OuterVolumeSpecName: "kube-api-access-sqjqn") pod "427902e9-1bd7-4691-9b45-05094bffa63f" (UID: "427902e9-1bd7-4691-9b45-05094bffa63f"). InnerVolumeSpecName "kube-api-access-sqjqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.685550 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjqn\" (UniqueName: \"kubernetes.io/projected/427902e9-1bd7-4691-9b45-05094bffa63f-kube-api-access-sqjqn\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.685578 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427902e9-1bd7-4691-9b45-05094bffa63f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4774]: I1001 14:00:03.685586 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427902e9-1bd7-4691-9b45-05094bffa63f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:04 crc kubenswrapper[4774]: I1001 14:00:04.182047 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" Oct 01 14:00:04 crc kubenswrapper[4774]: I1001 14:00:04.182047 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-p55ht" event={"ID":"427902e9-1bd7-4691-9b45-05094bffa63f","Type":"ContainerDied","Data":"e0682cae71988748b0d3e88fe7f870b8b91aef68c28e8dd2869bc4a3f12e5b08"} Oct 01 14:00:04 crc kubenswrapper[4774]: I1001 14:00:04.184874 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0682cae71988748b0d3e88fe7f870b8b91aef68c28e8dd2869bc4a3f12e5b08" Oct 01 14:00:05 crc kubenswrapper[4774]: I1001 14:00:05.712393 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 14:00:08 crc kubenswrapper[4774]: I1001 14:00:08.295390 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:00:09 crc kubenswrapper[4774]: I1001 14:00:09.612264 4774 scope.go:117] "RemoveContainer" containerID="66d3cb431714dcee9d8232f0d0b5a4dbf07b26c4f11b31f6ecc2fcee8b114555" Oct 01 14:00:09 crc kubenswrapper[4774]: I1001 14:00:09.643856 4774 scope.go:117] "RemoveContainer" containerID="29b807d06e04e6131a92e43adae71ecd618b89a9e9a595136e8983e970622202" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.797216 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-nsvjs"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.802073 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tfjw7"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.811366 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-bootstrap-tfjw7"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.819738 
4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-sync-nsvjs"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.842612 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/keystonecbe7-account-delete-wdl97"] Oct 01 14:00:21 crc kubenswrapper[4774]: E1001 14:00:21.843132 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427902e9-1bd7-4691-9b45-05094bffa63f" containerName="collect-profiles" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.843236 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="427902e9-1bd7-4691-9b45-05094bffa63f" containerName="collect-profiles" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.843491 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="427902e9-1bd7-4691-9b45-05094bffa63f" containerName="collect-profiles" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.844102 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.849995 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-f66dcddc7-c4gg8"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.850249 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" podUID="305cbbd8-b936-418b-9528-00d81715f080" containerName="keystone-api" containerID="cri-o://42f6587bee42e963fe8c4a76b48e7d56ffb5525da05823dc61d6053869639214" gracePeriod=30 Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.857007 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonecbe7-account-delete-wdl97"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.900775 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgj5w\" 
(UniqueName: \"kubernetes.io/projected/3c6f1b72-095b-4995-bd76-de826b241e31-kube-api-access-wgj5w\") pod \"keystonecbe7-account-delete-wdl97\" (UID: \"3c6f1b72-095b-4995-bd76-de826b241e31\") " pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.907752 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.909023 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.911979 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"openstack-config-secret" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.912165 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"keystone-kuttl-tests"/"openstack-config" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.912368 4774 reflector.go:368] Caches populated for *v1.Secret from object-"keystone-kuttl-tests"/"default-dockercfg-cxrkt" Oct 01 14:00:21 crc kubenswrapper[4774]: I1001 14:00:21.920658 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.002249 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgj5w\" (UniqueName: \"kubernetes.io/projected/3c6f1b72-095b-4995-bd76-de826b241e31-kube-api-access-wgj5w\") pod \"keystonecbe7-account-delete-wdl97\" (UID: \"3c6f1b72-095b-4995-bd76-de826b241e31\") " pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.002573 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod 
\"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.002603 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.002807 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgzvf\" (UniqueName: \"kubernetes.io/projected/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-kube-api-access-cgzvf\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.023381 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgj5w\" (UniqueName: \"kubernetes.io/projected/3c6f1b72-095b-4995-bd76-de826b241e31-kube-api-access-wgj5w\") pod \"keystonecbe7-account-delete-wdl97\" (UID: \"3c6f1b72-095b-4995-bd76-de826b241e31\") " pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.104092 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.104160 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.104205 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgzvf\" (UniqueName: \"kubernetes.io/projected/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-kube-api-access-cgzvf\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.104233 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.104327 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:22.604301817 +0000 UTC m=+1394.493932454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.104381 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.104477 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:22.604432223 +0000 UTC m=+1394.494062860 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.136140 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgzvf\" (UniqueName: \"kubernetes.io/projected/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-kube-api-access-cgzvf\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.164075 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.611681 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.612132 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.611926 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.612242 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not 
found Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.612435 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:23.612394378 +0000 UTC m=+1395.502025005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:22 crc kubenswrapper[4774]: E1001 14:00:22.612538 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:23.612505213 +0000 UTC m=+1395.502135850 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.647595 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["keystone-kuttl-tests/keystonecbe7-account-delete-wdl97"] Oct 01 14:00:22 crc kubenswrapper[4774]: W1001 14:00:22.653225 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c6f1b72_095b_4995_bd76_de826b241e31.slice/crio-b70efcdbd5d5c9662239412db12055623435907d677708d893c15a8b7802785b WatchSource:0}: Error finding container b70efcdbd5d5c9662239412db12055623435907d677708d893c15a8b7802785b: Status 404 returned error can't find the container with id b70efcdbd5d5c9662239412db12055623435907d677708d893c15a8b7802785b Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.881164 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c06ab62-1df7-4af5-9ebd-6e74545adff8" path="/var/lib/kubelet/pods/2c06ab62-1df7-4af5-9ebd-6e74545adff8/volumes" Oct 01 14:00:22 crc kubenswrapper[4774]: I1001 14:00:22.882408 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d985f69d-6361-44f9-8565-8811bace1c81" path="/var/lib/kubelet/pods/d985f69d-6361-44f9-8565-8811bace1c81/volumes" Oct 01 14:00:23 crc kubenswrapper[4774]: I1001 14:00:23.361361 4774 generic.go:334] "Generic (PLEG): container finished" podID="3c6f1b72-095b-4995-bd76-de826b241e31" containerID="877bdc3a3ad8da2625d815d1333f32ae489849e04bffc9cc444df2b6463a72d4" exitCode=0 Oct 01 14:00:23 crc kubenswrapper[4774]: I1001 14:00:23.361521 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" 
event={"ID":"3c6f1b72-095b-4995-bd76-de826b241e31","Type":"ContainerDied","Data":"877bdc3a3ad8da2625d815d1333f32ae489849e04bffc9cc444df2b6463a72d4"} Oct 01 14:00:23 crc kubenswrapper[4774]: I1001 14:00:23.361881 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" event={"ID":"3c6f1b72-095b-4995-bd76-de826b241e31","Type":"ContainerStarted","Data":"b70efcdbd5d5c9662239412db12055623435907d677708d893c15a8b7802785b"} Oct 01 14:00:23 crc kubenswrapper[4774]: I1001 14:00:23.627630 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:23 crc kubenswrapper[4774]: E1001 14:00:23.627856 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:23 crc kubenswrapper[4774]: E1001 14:00:23.628533 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:25.628504576 +0000 UTC m=+1397.518135213 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:23 crc kubenswrapper[4774]: I1001 14:00:23.628755 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:23 crc kubenswrapper[4774]: E1001 14:00:23.629013 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:00:23 crc kubenswrapper[4774]: E1001 14:00:23.629131 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:25.629102473 +0000 UTC m=+1397.518733080 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:24 crc kubenswrapper[4774]: I1001 14:00:24.747931 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:24 crc kubenswrapper[4774]: I1001 14:00:24.847802 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgj5w\" (UniqueName: \"kubernetes.io/projected/3c6f1b72-095b-4995-bd76-de826b241e31-kube-api-access-wgj5w\") pod \"3c6f1b72-095b-4995-bd76-de826b241e31\" (UID: \"3c6f1b72-095b-4995-bd76-de826b241e31\") " Oct 01 14:00:24 crc kubenswrapper[4774]: I1001 14:00:24.854720 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6f1b72-095b-4995-bd76-de826b241e31-kube-api-access-wgj5w" (OuterVolumeSpecName: "kube-api-access-wgj5w") pod "3c6f1b72-095b-4995-bd76-de826b241e31" (UID: "3c6f1b72-095b-4995-bd76-de826b241e31"). InnerVolumeSpecName "kube-api-access-wgj5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:24 crc kubenswrapper[4774]: I1001 14:00:24.950028 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgj5w\" (UniqueName: \"kubernetes.io/projected/3c6f1b72-095b-4995-bd76-de826b241e31-kube-api-access-wgj5w\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.384332 4774 generic.go:334] "Generic (PLEG): container finished" podID="305cbbd8-b936-418b-9528-00d81715f080" containerID="42f6587bee42e963fe8c4a76b48e7d56ffb5525da05823dc61d6053869639214" exitCode=0 Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.384505 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" event={"ID":"305cbbd8-b936-418b-9528-00d81715f080","Type":"ContainerDied","Data":"42f6587bee42e963fe8c4a76b48e7d56ffb5525da05823dc61d6053869639214"} Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.384548 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" 
event={"ID":"305cbbd8-b936-418b-9528-00d81715f080","Type":"ContainerDied","Data":"1389dbe3e66adfd95d0697ea0390aac6edb4e93b82bc4bfb3019df79330d6525"} Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.384575 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1389dbe3e66adfd95d0697ea0390aac6edb4e93b82bc4bfb3019df79330d6525" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.386645 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" event={"ID":"3c6f1b72-095b-4995-bd76-de826b241e31","Type":"ContainerDied","Data":"b70efcdbd5d5c9662239412db12055623435907d677708d893c15a8b7802785b"} Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.386683 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b70efcdbd5d5c9662239412db12055623435907d677708d893c15a8b7802785b" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.386710 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystonecbe7-account-delete-wdl97" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.389967 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.458403 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-scripts\") pod \"305cbbd8-b936-418b-9528-00d81715f080\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.458492 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-credential-keys\") pod \"305cbbd8-b936-418b-9528-00d81715f080\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.458543 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-fernet-keys\") pod \"305cbbd8-b936-418b-9528-00d81715f080\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.458634 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s47zn\" (UniqueName: \"kubernetes.io/projected/305cbbd8-b936-418b-9528-00d81715f080-kube-api-access-s47zn\") pod \"305cbbd8-b936-418b-9528-00d81715f080\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.458666 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-config-data\") pod \"305cbbd8-b936-418b-9528-00d81715f080\" (UID: \"305cbbd8-b936-418b-9528-00d81715f080\") " Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.461731 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "305cbbd8-b936-418b-9528-00d81715f080" (UID: "305cbbd8-b936-418b-9528-00d81715f080"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.462695 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-scripts" (OuterVolumeSpecName: "scripts") pod "305cbbd8-b936-418b-9528-00d81715f080" (UID: "305cbbd8-b936-418b-9528-00d81715f080"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.463602 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "305cbbd8-b936-418b-9528-00d81715f080" (UID: "305cbbd8-b936-418b-9528-00d81715f080"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.464623 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305cbbd8-b936-418b-9528-00d81715f080-kube-api-access-s47zn" (OuterVolumeSpecName: "kube-api-access-s47zn") pod "305cbbd8-b936-418b-9528-00d81715f080" (UID: "305cbbd8-b936-418b-9528-00d81715f080"). InnerVolumeSpecName "kube-api-access-s47zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.488044 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-config-data" (OuterVolumeSpecName: "config-data") pod "305cbbd8-b936-418b-9528-00d81715f080" (UID: "305cbbd8-b936-418b-9528-00d81715f080"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.560760 4774 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.560805 4774 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.560824 4774 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.560842 4774 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/305cbbd8-b936-418b-9528-00d81715f080-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.560860 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s47zn\" (UniqueName: \"kubernetes.io/projected/305cbbd8-b936-418b-9528-00d81715f080-kube-api-access-s47zn\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.662524 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:25 crc kubenswrapper[4774]: I1001 14:00:25.662595 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:25 crc kubenswrapper[4774]: E1001 14:00:25.662724 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:25 crc kubenswrapper[4774]: E1001 14:00:25.662803 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:29.662779316 +0000 UTC m=+1401.552409953 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:25 crc kubenswrapper[4774]: E1001 14:00:25.662838 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:00:25 crc kubenswrapper[4774]: E1001 14:00:25.662894 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:29.66287699 +0000 UTC m=+1401.552507617 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.395978 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/keystone-f66dcddc7-c4gg8" Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.435056 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-f66dcddc7-c4gg8"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.439034 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-f66dcddc7-c4gg8"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.863415 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cp2bd"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.895362 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305cbbd8-b936-418b-9528-00d81715f080" path="/var/lib/kubelet/pods/305cbbd8-b936-418b-9528-00d81715f080/volumes" Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.896279 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-db-create-cp2bd"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.896336 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.904315 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystone-cbe7-account-create-rkg6j"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 14:00:26.913241 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/keystonecbe7-account-delete-wdl97"] Oct 01 14:00:26 crc kubenswrapper[4774]: I1001 
14:00:26.925936 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/keystonecbe7-account-delete-wdl97"] Oct 01 14:00:28 crc kubenswrapper[4774]: I1001 14:00:28.885178 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e9c8de-7c38-46cc-93e4-7dd8008eaed7" path="/var/lib/kubelet/pods/31e9c8de-7c38-46cc-93e4-7dd8008eaed7/volumes" Oct 01 14:00:28 crc kubenswrapper[4774]: I1001 14:00:28.886774 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6f1b72-095b-4995-bd76-de826b241e31" path="/var/lib/kubelet/pods/3c6f1b72-095b-4995-bd76-de826b241e31/volumes" Oct 01 14:00:28 crc kubenswrapper[4774]: I1001 14:00:28.887626 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a15ed2-6de1-46b5-9827-cdc086c1286f" path="/var/lib/kubelet/pods/82a15ed2-6de1-46b5-9827-cdc086c1286f/volumes" Oct 01 14:00:29 crc kubenswrapper[4774]: I1001 14:00:29.724585 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:29 crc kubenswrapper[4774]: I1001 14:00:29.724653 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:29 crc kubenswrapper[4774]: E1001 14:00:29.724869 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:00:29 crc kubenswrapper[4774]: E1001 14:00:29.724939 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:37.724919602 +0000 UTC m=+1409.614550199 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:29 crc kubenswrapper[4774]: E1001 14:00:29.725022 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:29 crc kubenswrapper[4774]: E1001 14:00:29.725183 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:37.725150548 +0000 UTC m=+1409.614781185 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:37 crc kubenswrapper[4774]: I1001 14:00:37.765084 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:37 crc kubenswrapper[4774]: I1001 14:00:37.765797 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:37 crc kubenswrapper[4774]: E1001 14:00:37.765282 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:37 crc kubenswrapper[4774]: E1001 14:00:37.765959 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:53.765930572 +0000 UTC m=+1425.655561209 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:37 crc kubenswrapper[4774]: E1001 14:00:37.765995 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:00:37 crc kubenswrapper[4774]: E1001 14:00:37.766066 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:00:53.766045975 +0000 UTC m=+1425.655676612 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:53 crc kubenswrapper[4774]: I1001 14:00:53.844419 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:53 crc kubenswrapper[4774]: I1001 14:00:53.846635 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:00:53 crc kubenswrapper[4774]: E1001 14:00:53.844704 4774 configmap.go:193] Couldn't get configMap 
keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:00:53 crc kubenswrapper[4774]: E1001 14:00:53.847051 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:01:25.847017469 +0000 UTC m=+1457.736648096 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:00:53 crc kubenswrapper[4774]: E1001 14:00:53.846853 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:00:53 crc kubenswrapper[4774]: E1001 14:00:53.847412 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:01:25.847381889 +0000 UTC m=+1457.737012526 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.103658 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pwjgr"] Oct 01 14:00:56 crc kubenswrapper[4774]: E1001 14:00:56.104416 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305cbbd8-b936-418b-9528-00d81715f080" containerName="keystone-api" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.104436 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="305cbbd8-b936-418b-9528-00d81715f080" containerName="keystone-api" Oct 01 14:00:56 crc kubenswrapper[4774]: E1001 14:00:56.104494 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6f1b72-095b-4995-bd76-de826b241e31" containerName="mariadb-account-delete" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.104506 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6f1b72-095b-4995-bd76-de826b241e31" containerName="mariadb-account-delete" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.104709 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6f1b72-095b-4995-bd76-de826b241e31" containerName="mariadb-account-delete" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.104733 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="305cbbd8-b936-418b-9528-00d81715f080" containerName="keystone-api" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.107026 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.111582 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwjgr"] Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.188341 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9qr\" (UniqueName: \"kubernetes.io/projected/b732847f-3b84-436a-8617-1dedd76c9eb8-kube-api-access-qx9qr\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.188409 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-catalog-content\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.188491 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-utilities\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.289410 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-utilities\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.289521 4774 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qx9qr\" (UniqueName: \"kubernetes.io/projected/b732847f-3b84-436a-8617-1dedd76c9eb8-kube-api-access-qx9qr\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.289548 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-catalog-content\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.289987 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-catalog-content\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.290212 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-utilities\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.315474 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9qr\" (UniqueName: \"kubernetes.io/projected/b732847f-3b84-436a-8617-1dedd76c9eb8-kube-api-access-qx9qr\") pod \"community-operators-pwjgr\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") " pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.428252 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:00:56 crc kubenswrapper[4774]: I1001 14:00:56.707877 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwjgr"] Oct 01 14:00:57 crc kubenswrapper[4774]: I1001 14:00:57.673130 4774 generic.go:334] "Generic (PLEG): container finished" podID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerID="19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c" exitCode=0 Oct 01 14:00:57 crc kubenswrapper[4774]: I1001 14:00:57.673261 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjgr" event={"ID":"b732847f-3b84-436a-8617-1dedd76c9eb8","Type":"ContainerDied","Data":"19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c"} Oct 01 14:00:57 crc kubenswrapper[4774]: I1001 14:00:57.673577 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjgr" event={"ID":"b732847f-3b84-436a-8617-1dedd76c9eb8","Type":"ContainerStarted","Data":"68922cc01c02b3cfee2366e0d97a0fc24a79dd126e193ae410bb812613edfbc6"} Oct 01 14:00:57 crc kubenswrapper[4774]: I1001 14:00:57.675905 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:01:00 crc kubenswrapper[4774]: I1001 14:01:00.714238 4774 generic.go:334] "Generic (PLEG): container finished" podID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerID="7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab" exitCode=0 Oct 01 14:01:00 crc kubenswrapper[4774]: I1001 14:01:00.714344 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjgr" event={"ID":"b732847f-3b84-436a-8617-1dedd76c9eb8","Type":"ContainerDied","Data":"7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab"} Oct 01 14:01:05 crc kubenswrapper[4774]: I1001 14:01:05.763978 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-pwjgr" event={"ID":"b732847f-3b84-436a-8617-1dedd76c9eb8","Type":"ContainerStarted","Data":"7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae"} Oct 01 14:01:05 crc kubenswrapper[4774]: I1001 14:01:05.782648 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pwjgr" podStartSLOduration=2.239735476 podStartE2EDuration="9.782623991s" podCreationTimestamp="2025-10-01 14:00:56 +0000 UTC" firstStartedPulling="2025-10-01 14:00:57.675379895 +0000 UTC m=+1429.565010532" lastFinishedPulling="2025-10-01 14:01:05.21826845 +0000 UTC m=+1437.107899047" observedRunningTime="2025-10-01 14:01:05.78185412 +0000 UTC m=+1437.671484717" watchObservedRunningTime="2025-10-01 14:01:05.782623991 +0000 UTC m=+1437.672254598" Oct 01 14:01:06 crc kubenswrapper[4774]: I1001 14:01:06.428395 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:01:06 crc kubenswrapper[4774]: I1001 14:01:06.428659 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pwjgr" Oct 01 14:01:07 crc kubenswrapper[4774]: I1001 14:01:07.271224 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:01:07 crc kubenswrapper[4774]: I1001 14:01:07.271685 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:01:07 crc 
kubenswrapper[4774]: I1001 14:01:07.498604 4774 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-pwjgr" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="registry-server" probeResult="failure" output=<
Oct 01 14:01:07 crc kubenswrapper[4774]: timeout: failed to connect service ":50051" within 1s
Oct 01 14:01:07 crc kubenswrapper[4774]: >
Oct 01 14:01:09 crc kubenswrapper[4774]: I1001 14:01:09.743769 4774 scope.go:117] "RemoveContainer" containerID="53ab217bd33ad948ae0545c8996270e8906b461cca5799d8511fb886d0e8bf04"
Oct 01 14:01:09 crc kubenswrapper[4774]: I1001 14:01:09.775439 4774 scope.go:117] "RemoveContainer" containerID="a8ec439546cd6808b3a400a08dcc184edb5034a7d18a9540a1acf3799c6cb7df"
Oct 01 14:01:09 crc kubenswrapper[4774]: I1001 14:01:09.841025 4774 scope.go:117] "RemoveContainer" containerID="5bd4dac455924a756e50e16687adb78935bb529ea0340011064951b3f37d5fe1"
Oct 01 14:01:09 crc kubenswrapper[4774]: I1001 14:01:09.885712 4774 scope.go:117] "RemoveContainer" containerID="74a2ad2e5a34884c71e59f754d976f350d43dc2193a5a920302172f89b216d02"
Oct 01 14:01:16 crc kubenswrapper[4774]: I1001 14:01:16.499708 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pwjgr"
Oct 01 14:01:16 crc kubenswrapper[4774]: I1001 14:01:16.578169 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pwjgr"
Oct 01 14:01:16 crc kubenswrapper[4774]: I1001 14:01:16.748169 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwjgr"]
Oct 01 14:01:17 crc kubenswrapper[4774]: I1001 14:01:17.869576 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pwjgr" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="registry-server" containerID="cri-o://7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae" gracePeriod=2
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.366986 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwjgr"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.474190 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-utilities\") pod \"b732847f-3b84-436a-8617-1dedd76c9eb8\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") "
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.474228 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-catalog-content\") pod \"b732847f-3b84-436a-8617-1dedd76c9eb8\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") "
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.474270 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx9qr\" (UniqueName: \"kubernetes.io/projected/b732847f-3b84-436a-8617-1dedd76c9eb8-kube-api-access-qx9qr\") pod \"b732847f-3b84-436a-8617-1dedd76c9eb8\" (UID: \"b732847f-3b84-436a-8617-1dedd76c9eb8\") "
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.475217 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-utilities" (OuterVolumeSpecName: "utilities") pod "b732847f-3b84-436a-8617-1dedd76c9eb8" (UID: "b732847f-3b84-436a-8617-1dedd76c9eb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.480001 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b732847f-3b84-436a-8617-1dedd76c9eb8-kube-api-access-qx9qr" (OuterVolumeSpecName: "kube-api-access-qx9qr") pod "b732847f-3b84-436a-8617-1dedd76c9eb8" (UID: "b732847f-3b84-436a-8617-1dedd76c9eb8"). InnerVolumeSpecName "kube-api-access-qx9qr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.529664 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b732847f-3b84-436a-8617-1dedd76c9eb8" (UID: "b732847f-3b84-436a-8617-1dedd76c9eb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.588605 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.588662 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b732847f-3b84-436a-8617-1dedd76c9eb8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.588689 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx9qr\" (UniqueName: \"kubernetes.io/projected/b732847f-3b84-436a-8617-1dedd76c9eb8-kube-api-access-qx9qr\") on node \"crc\" DevicePath \"\""
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.885022 4774 generic.go:334] "Generic (PLEG): container finished" podID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerID="7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae" exitCode=0
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.885171 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwjgr"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.889668 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjgr" event={"ID":"b732847f-3b84-436a-8617-1dedd76c9eb8","Type":"ContainerDied","Data":"7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae"}
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.889729 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwjgr" event={"ID":"b732847f-3b84-436a-8617-1dedd76c9eb8","Type":"ContainerDied","Data":"68922cc01c02b3cfee2366e0d97a0fc24a79dd126e193ae410bb812613edfbc6"}
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.889759 4774 scope.go:117] "RemoveContainer" containerID="7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.921478 4774 scope.go:117] "RemoveContainer" containerID="7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.944683 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwjgr"]
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.951581 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pwjgr"]
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.957770 4774 scope.go:117] "RemoveContainer" containerID="19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.978053 4774 scope.go:117] "RemoveContainer" containerID="7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae"
Oct 01 14:01:18 crc kubenswrapper[4774]: E1001 14:01:18.978925 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae\": container with ID starting with 7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae not found: ID does not exist" containerID="7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.978979 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae"} err="failed to get container status \"7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae\": rpc error: code = NotFound desc = could not find container \"7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae\": container with ID starting with 7cc39f08c02acb12ca039040adb2cbb6d531aeaf110046befaccc40a95e105ae not found: ID does not exist"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.979078 4774 scope.go:117] "RemoveContainer" containerID="7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab"
Oct 01 14:01:18 crc kubenswrapper[4774]: E1001 14:01:18.979560 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab\": container with ID starting with 7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab not found: ID does not exist" containerID="7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.979644 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab"} err="failed to get container status \"7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab\": rpc error: code = NotFound desc = could not find container \"7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab\": container with ID starting with 7404f6af97fa515c412528b50de4adab51380915c6125afcf6c89715234427ab not found: ID does not exist"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.979692 4774 scope.go:117] "RemoveContainer" containerID="19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c"
Oct 01 14:01:18 crc kubenswrapper[4774]: E1001 14:01:18.980143 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c\": container with ID starting with 19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c not found: ID does not exist" containerID="19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c"
Oct 01 14:01:18 crc kubenswrapper[4774]: I1001 14:01:18.980193 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c"} err="failed to get container status \"19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c\": rpc error: code = NotFound desc = could not find container \"19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c\": container with ID starting with 19520359fbbe090171c56ab9392447b8281c7fb2775d4ab31277f916955b603c not found: ID does not exist"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.362840 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z8jbw"]
Oct 01 14:01:19 crc kubenswrapper[4774]: E1001 14:01:19.363197 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="extract-content"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.363224 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="extract-content"
Oct 01 14:01:19 crc kubenswrapper[4774]: E1001 14:01:19.363258 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="extract-utilities"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.363271 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="extract-utilities"
Oct 01 14:01:19 crc kubenswrapper[4774]: E1001 14:01:19.363291 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="registry-server"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.363304 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="registry-server"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.363575 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" containerName="registry-server"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.365049 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.382997 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z8jbw"]
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.503263 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-catalog-content\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.503350 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-utilities\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.503603 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkjzf\" (UniqueName: \"kubernetes.io/projected/adf1730e-cfec-4715-90c9-9fcd55b1591c-kube-api-access-gkjzf\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.605079 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkjzf\" (UniqueName: \"kubernetes.io/projected/adf1730e-cfec-4715-90c9-9fcd55b1591c-kube-api-access-gkjzf\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.605288 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-catalog-content\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.605345 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-utilities\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.605934 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-utilities\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.606065 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-catalog-content\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.633778 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkjzf\" (UniqueName: \"kubernetes.io/projected/adf1730e-cfec-4715-90c9-9fcd55b1591c-kube-api-access-gkjzf\") pod \"redhat-operators-z8jbw\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") " pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:19 crc kubenswrapper[4774]: I1001 14:01:19.691874 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:20 crc kubenswrapper[4774]: I1001 14:01:20.164008 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z8jbw"]
Oct 01 14:01:20 crc kubenswrapper[4774]: W1001 14:01:20.175743 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf1730e_cfec_4715_90c9_9fcd55b1591c.slice/crio-5a5b212a9ac2c7829dc0b9acfe4084007036ee5f921f6aef9ca2d5fe8ce9e21b WatchSource:0}: Error finding container 5a5b212a9ac2c7829dc0b9acfe4084007036ee5f921f6aef9ca2d5fe8ce9e21b: Status 404 returned error can't find the container with id 5a5b212a9ac2c7829dc0b9acfe4084007036ee5f921f6aef9ca2d5fe8ce9e21b
Oct 01 14:01:20 crc kubenswrapper[4774]: I1001 14:01:20.893692 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b732847f-3b84-436a-8617-1dedd76c9eb8" path="/var/lib/kubelet/pods/b732847f-3b84-436a-8617-1dedd76c9eb8/volumes"
Oct 01 14:01:20 crc kubenswrapper[4774]: I1001 14:01:20.908415 4774 generic.go:334] "Generic (PLEG): container finished" podID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerID="cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499" exitCode=0
Oct 01 14:01:20 crc kubenswrapper[4774]: I1001 14:01:20.908480 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerDied","Data":"cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499"}
Oct 01 14:01:20 crc kubenswrapper[4774]: I1001 14:01:20.908513 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerStarted","Data":"5a5b212a9ac2c7829dc0b9acfe4084007036ee5f921f6aef9ca2d5fe8ce9e21b"}
Oct 01 14:01:21 crc kubenswrapper[4774]: I1001 14:01:21.922819 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerStarted","Data":"53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797"}
Oct 01 14:01:22 crc kubenswrapper[4774]: I1001 14:01:22.937845 4774 generic.go:334] "Generic (PLEG): container finished" podID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerID="53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797" exitCode=0
Oct 01 14:01:22 crc kubenswrapper[4774]: I1001 14:01:22.938049 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerDied","Data":"53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797"}
Oct 01 14:01:23 crc kubenswrapper[4774]: I1001 14:01:23.952402 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerStarted","Data":"4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00"}
Oct 01 14:01:23 crc kubenswrapper[4774]: I1001 14:01:23.980410 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z8jbw" podStartSLOduration=2.343486991 podStartE2EDuration="4.980387831s" podCreationTimestamp="2025-10-01 14:01:19 +0000 UTC" firstStartedPulling="2025-10-01 14:01:20.910947844 +0000 UTC m=+1452.800578481" lastFinishedPulling="2025-10-01 14:01:23.547848694 +0000 UTC m=+1455.437479321" observedRunningTime="2025-10-01 14:01:23.976928198 +0000 UTC m=+1455.866558835" watchObservedRunningTime="2025-10-01 14:01:23.980387831 +0000 UTC m=+1455.870018438"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.160962 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-22gtx"]
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.162386 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.179817 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-22gtx"]
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.296997 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-utilities\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.297227 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-catalog-content\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.297574 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn978\" (UniqueName: \"kubernetes.io/projected/b99bd6c6-9930-4b06-92c3-6311d2113a89-kube-api-access-cn978\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.399719 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-catalog-content\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.399846 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn978\" (UniqueName: \"kubernetes.io/projected/b99bd6c6-9930-4b06-92c3-6311d2113a89-kube-api-access-cn978\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.399889 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-utilities\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.400216 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-catalog-content\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.400325 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-utilities\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.417090 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn978\" (UniqueName: \"kubernetes.io/projected/b99bd6c6-9930-4b06-92c3-6311d2113a89-kube-api-access-cn978\") pod \"certified-operators-22gtx\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.517851 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-22gtx"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.766663 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-22gtx"]
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.907065 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.907107 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:01:25 crc kubenswrapper[4774]: E1001 14:01:25.907246 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found
Oct 01 14:01:25 crc kubenswrapper[4774]: E1001 14:01:25.907278 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Oct 01 14:01:25 crc kubenswrapper[4774]: E1001 14:01:25.907327 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:02:29.907309372 +0000 UTC m=+1521.796939969 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found
Oct 01 14:01:25 crc kubenswrapper[4774]: E1001 14:01:25.907345 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:02:29.907338672 +0000 UTC m=+1521.796969269 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.967847 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerStarted","Data":"d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b"}
Oct 01 14:01:25 crc kubenswrapper[4774]: I1001 14:01:25.968185 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerStarted","Data":"fc8a3cab5f20495e202fcd06bd1efd38133897f7371de4ff2a04e4e12345718e"}
Oct 01 14:01:26 crc kubenswrapper[4774]: I1001 14:01:26.975830 4774 generic.go:334] "Generic (PLEG): container finished" podID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerID="d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b" exitCode=0
Oct 01 14:01:26 crc kubenswrapper[4774]: I1001 14:01:26.975874 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerDied","Data":"d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b"}
Oct 01 14:01:28 crc kubenswrapper[4774]: I1001 14:01:28.992086 4774 generic.go:334] "Generic (PLEG): container finished" podID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerID="cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4" exitCode=0
Oct 01 14:01:28 crc kubenswrapper[4774]: I1001 14:01:28.992177 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerDied","Data":"cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4"}
Oct 01 14:01:29 crc kubenswrapper[4774]: I1001 14:01:29.692971 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:29 crc kubenswrapper[4774]: I1001 14:01:29.693286 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:29 crc kubenswrapper[4774]: I1001 14:01:29.735318 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:30 crc kubenswrapper[4774]: I1001 14:01:30.004698 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerStarted","Data":"d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398"}
Oct 01 14:01:30 crc kubenswrapper[4774]: I1001 14:01:30.037488 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-22gtx" podStartSLOduration=2.573216053 podStartE2EDuration="5.037435009s" podCreationTimestamp="2025-10-01 14:01:25 +0000 UTC" firstStartedPulling="2025-10-01 14:01:26.977352916 +0000 UTC m=+1458.866983513" lastFinishedPulling="2025-10-01 14:01:29.441571842 +0000 UTC m=+1461.331202469" observedRunningTime="2025-10-01 14:01:30.032241158 +0000 UTC m=+1461.921871795" watchObservedRunningTime="2025-10-01 14:01:30.037435009 +0000 UTC m=+1461.927065646"
Oct 01 14:01:30 crc kubenswrapper[4774]: I1001 14:01:30.096303 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.149788 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z8jbw"]
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.150714 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z8jbw" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="registry-server" containerID="cri-o://4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00" gracePeriod=2
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.648344 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.741995 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-utilities\") pod \"adf1730e-cfec-4715-90c9-9fcd55b1591c\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") "
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.742076 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkjzf\" (UniqueName: \"kubernetes.io/projected/adf1730e-cfec-4715-90c9-9fcd55b1591c-kube-api-access-gkjzf\") pod \"adf1730e-cfec-4715-90c9-9fcd55b1591c\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") "
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.742120 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-catalog-content\") pod \"adf1730e-cfec-4715-90c9-9fcd55b1591c\" (UID: \"adf1730e-cfec-4715-90c9-9fcd55b1591c\") "
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.743092 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-utilities" (OuterVolumeSpecName: "utilities") pod "adf1730e-cfec-4715-90c9-9fcd55b1591c" (UID: "adf1730e-cfec-4715-90c9-9fcd55b1591c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.755712 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf1730e-cfec-4715-90c9-9fcd55b1591c-kube-api-access-gkjzf" (OuterVolumeSpecName: "kube-api-access-gkjzf") pod "adf1730e-cfec-4715-90c9-9fcd55b1591c" (UID: "adf1730e-cfec-4715-90c9-9fcd55b1591c"). InnerVolumeSpecName "kube-api-access-gkjzf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.844079 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:01:32 crc kubenswrapper[4774]: I1001 14:01:32.844125 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkjzf\" (UniqueName: \"kubernetes.io/projected/adf1730e-cfec-4715-90c9-9fcd55b1591c-kube-api-access-gkjzf\") on node \"crc\" DevicePath \"\""
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.039550 4774 generic.go:334] "Generic (PLEG): container finished" podID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerID="4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00" exitCode=0
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.039627 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerDied","Data":"4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00"}
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.039638 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8jbw"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.039689 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8jbw" event={"ID":"adf1730e-cfec-4715-90c9-9fcd55b1591c","Type":"ContainerDied","Data":"5a5b212a9ac2c7829dc0b9acfe4084007036ee5f921f6aef9ca2d5fe8ce9e21b"}
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.039732 4774 scope.go:117] "RemoveContainer" containerID="4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.071173 4774 scope.go:117] "RemoveContainer" containerID="53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.097652 4774 scope.go:117] "RemoveContainer" containerID="cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.141814 4774 scope.go:117] "RemoveContainer" containerID="4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00"
Oct 01 14:01:33 crc kubenswrapper[4774]: E1001 14:01:33.142521 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00\": container with ID starting with 4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00 not found: ID does not exist" containerID="4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.142580 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00"} err="failed to get container status \"4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00\": rpc error: code = NotFound desc = could not find container \"4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00\": container with ID starting with 4f88345f6800fd7254240504718cd53774fa022c9fe504d821320070c6dfee00 not found: ID does not exist"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.142614 4774 scope.go:117] "RemoveContainer" containerID="53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797"
Oct 01 14:01:33 crc kubenswrapper[4774]: E1001 14:01:33.143327 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797\": container with ID starting with 53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797 not found: ID does not exist" containerID="53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.143384 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797"} err="failed to get container status \"53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797\": rpc error: code = NotFound desc = could not find container \"53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797\": container with ID starting with 53138d7eb5927b3b0cf0574f70250242dc033352ef55a0cf039c6e7ef91e3797 not found: ID does not exist"
Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.143423 4774 scope.go:117] "RemoveContainer" containerID="cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499"
Oct 01 14:01:33 crc kubenswrapper[4774]: E1001 14:01:33.143828 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499\": container with ID starting with cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499 not found: ID does not exist"
containerID="cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499" Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.143873 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499"} err="failed to get container status \"cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499\": rpc error: code = NotFound desc = could not find container \"cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499\": container with ID starting with cc6c037c3f38c8bc99278c1f88697498691abebca8bcf7d4f8c97d59caa32499 not found: ID does not exist" Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.573411 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adf1730e-cfec-4715-90c9-9fcd55b1591c" (UID: "adf1730e-cfec-4715-90c9-9fcd55b1591c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.658881 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adf1730e-cfec-4715-90c9-9fcd55b1591c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.691119 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z8jbw"] Oct 01 14:01:33 crc kubenswrapper[4774]: I1001 14:01:33.700184 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z8jbw"] Oct 01 14:01:34 crc kubenswrapper[4774]: I1001 14:01:34.877406 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" path="/var/lib/kubelet/pods/adf1730e-cfec-4715-90c9-9fcd55b1591c/volumes" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.168170 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tl84h"] Oct 01 14:01:35 crc kubenswrapper[4774]: E1001 14:01:35.168562 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="extract-utilities" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.168593 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="extract-utilities" Oct 01 14:01:35 crc kubenswrapper[4774]: E1001 14:01:35.168614 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="extract-content" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.168627 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="extract-content" Oct 01 14:01:35 crc kubenswrapper[4774]: E1001 14:01:35.168660 4774 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="registry-server" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.168686 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="registry-server" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.168892 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf1730e-cfec-4715-90c9-9fcd55b1591c" containerName="registry-server" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.170302 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.188071 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl84h"] Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.226640 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlqbk\" (UniqueName: \"kubernetes.io/projected/40e41281-dfb2-4126-806d-92671b997dcc-kube-api-access-wlqbk\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.226712 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-utilities\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.226922 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-catalog-content\") pod \"redhat-marketplace-tl84h\" (UID: 
\"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.328758 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlqbk\" (UniqueName: \"kubernetes.io/projected/40e41281-dfb2-4126-806d-92671b997dcc-kube-api-access-wlqbk\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.328845 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-utilities\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.328909 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-catalog-content\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.329686 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-utilities\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.329792 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-catalog-content\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " 
pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.363407 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlqbk\" (UniqueName: \"kubernetes.io/projected/40e41281-dfb2-4126-806d-92671b997dcc-kube-api-access-wlqbk\") pod \"redhat-marketplace-tl84h\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.498328 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.518945 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-22gtx" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.519019 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-22gtx" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.597116 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-22gtx" Oct 01 14:01:35 crc kubenswrapper[4774]: I1001 14:01:35.810257 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl84h"] Oct 01 14:01:36 crc kubenswrapper[4774]: I1001 14:01:36.068729 4774 generic.go:334] "Generic (PLEG): container finished" podID="40e41281-dfb2-4126-806d-92671b997dcc" containerID="8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a" exitCode=0 Oct 01 14:01:36 crc kubenswrapper[4774]: I1001 14:01:36.068822 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl84h" event={"ID":"40e41281-dfb2-4126-806d-92671b997dcc","Type":"ContainerDied","Data":"8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a"} Oct 01 14:01:36 crc 
kubenswrapper[4774]: I1001 14:01:36.069107 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl84h" event={"ID":"40e41281-dfb2-4126-806d-92671b997dcc","Type":"ContainerStarted","Data":"1dab3930bfbba12e9a95dd76fe8eb84a6ae0aaa10804163a1eecbf3101a91259"} Oct 01 14:01:36 crc kubenswrapper[4774]: I1001 14:01:36.145149 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-22gtx" Oct 01 14:01:37 crc kubenswrapper[4774]: I1001 14:01:37.270925 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:01:37 crc kubenswrapper[4774]: I1001 14:01:37.270980 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:01:38 crc kubenswrapper[4774]: I1001 14:01:38.092986 4774 generic.go:334] "Generic (PLEG): container finished" podID="40e41281-dfb2-4126-806d-92671b997dcc" containerID="f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a" exitCode=0 Oct 01 14:01:38 crc kubenswrapper[4774]: I1001 14:01:38.093087 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl84h" event={"ID":"40e41281-dfb2-4126-806d-92671b997dcc","Type":"ContainerDied","Data":"f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a"} Oct 01 14:01:38 crc kubenswrapper[4774]: I1001 14:01:38.551020 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-22gtx"] Oct 01 
14:01:38 crc kubenswrapper[4774]: I1001 14:01:38.551495 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-22gtx" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="registry-server" containerID="cri-o://d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398" gracePeriod=2 Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.008985 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-22gtx" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.088226 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn978\" (UniqueName: \"kubernetes.io/projected/b99bd6c6-9930-4b06-92c3-6311d2113a89-kube-api-access-cn978\") pod \"b99bd6c6-9930-4b06-92c3-6311d2113a89\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.088303 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-utilities\") pod \"b99bd6c6-9930-4b06-92c3-6311d2113a89\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.088380 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-catalog-content\") pod \"b99bd6c6-9930-4b06-92c3-6311d2113a89\" (UID: \"b99bd6c6-9930-4b06-92c3-6311d2113a89\") " Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.089818 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-utilities" (OuterVolumeSpecName: "utilities") pod "b99bd6c6-9930-4b06-92c3-6311d2113a89" (UID: "b99bd6c6-9930-4b06-92c3-6311d2113a89"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.094254 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99bd6c6-9930-4b06-92c3-6311d2113a89-kube-api-access-cn978" (OuterVolumeSpecName: "kube-api-access-cn978") pod "b99bd6c6-9930-4b06-92c3-6311d2113a89" (UID: "b99bd6c6-9930-4b06-92c3-6311d2113a89"). InnerVolumeSpecName "kube-api-access-cn978". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.105614 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl84h" event={"ID":"40e41281-dfb2-4126-806d-92671b997dcc","Type":"ContainerStarted","Data":"9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5"} Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.108724 4774 generic.go:334] "Generic (PLEG): container finished" podID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerID="d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398" exitCode=0 Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.108776 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerDied","Data":"d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398"} Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.108818 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-22gtx" event={"ID":"b99bd6c6-9930-4b06-92c3-6311d2113a89","Type":"ContainerDied","Data":"fc8a3cab5f20495e202fcd06bd1efd38133897f7371de4ff2a04e4e12345718e"} Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.108843 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-22gtx" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.108858 4774 scope.go:117] "RemoveContainer" containerID="d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.127759 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tl84h" podStartSLOduration=1.518667367 podStartE2EDuration="4.127741841s" podCreationTimestamp="2025-10-01 14:01:35 +0000 UTC" firstStartedPulling="2025-10-01 14:01:36.071502882 +0000 UTC m=+1467.961133489" lastFinishedPulling="2025-10-01 14:01:38.680577326 +0000 UTC m=+1470.570207963" observedRunningTime="2025-10-01 14:01:39.126663052 +0000 UTC m=+1471.016293669" watchObservedRunningTime="2025-10-01 14:01:39.127741841 +0000 UTC m=+1471.017372438" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.139050 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b99bd6c6-9930-4b06-92c3-6311d2113a89" (UID: "b99bd6c6-9930-4b06-92c3-6311d2113a89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.154407 4774 scope.go:117] "RemoveContainer" containerID="cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.184559 4774 scope.go:117] "RemoveContainer" containerID="d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.190274 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.190313 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn978\" (UniqueName: \"kubernetes.io/projected/b99bd6c6-9930-4b06-92c3-6311d2113a89-kube-api-access-cn978\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.190329 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99bd6c6-9930-4b06-92c3-6311d2113a89-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.215662 4774 scope.go:117] "RemoveContainer" containerID="d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398" Oct 01 14:01:39 crc kubenswrapper[4774]: E1001 14:01:39.216250 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398\": container with ID starting with d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398 not found: ID does not exist" containerID="d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.216312 4774 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398"} err="failed to get container status \"d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398\": rpc error: code = NotFound desc = could not find container \"d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398\": container with ID starting with d13d0c3d88077798e07340ce4571ea9a934a8a9cf854dea079609fdb8d431398 not found: ID does not exist" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.216348 4774 scope.go:117] "RemoveContainer" containerID="cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4" Oct 01 14:01:39 crc kubenswrapper[4774]: E1001 14:01:39.216886 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4\": container with ID starting with cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4 not found: ID does not exist" containerID="cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.216916 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4"} err="failed to get container status \"cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4\": rpc error: code = NotFound desc = could not find container \"cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4\": container with ID starting with cf0f6e3ad2a9f1f00f7ee13fb22e82d6454be707fe858e1abba8df0c444c9dd4 not found: ID does not exist" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.216935 4774 scope.go:117] "RemoveContainer" containerID="d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b" Oct 01 14:01:39 crc kubenswrapper[4774]: E1001 14:01:39.217323 4774 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b\": container with ID starting with d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b not found: ID does not exist" containerID="d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.217352 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b"} err="failed to get container status \"d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b\": rpc error: code = NotFound desc = could not find container \"d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b\": container with ID starting with d266ab461e22ead17ed1a1f5da72cd7f41d834649915a0f194cc8874ad32942b not found: ID does not exist" Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.438139 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-22gtx"] Oct 01 14:01:39 crc kubenswrapper[4774]: I1001 14:01:39.443378 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-22gtx"] Oct 01 14:01:40 crc kubenswrapper[4774]: I1001 14:01:40.885869 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" path="/var/lib/kubelet/pods/b99bd6c6-9930-4b06-92c3-6311d2113a89/volumes" Oct 01 14:01:45 crc kubenswrapper[4774]: I1001 14:01:45.499081 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:45 crc kubenswrapper[4774]: I1001 14:01:45.499848 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:45 crc kubenswrapper[4774]: I1001 14:01:45.575986 4774 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:46 crc kubenswrapper[4774]: I1001 14:01:46.267919 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:46 crc kubenswrapper[4774]: I1001 14:01:46.325725 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl84h"] Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.198363 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tl84h" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="registry-server" containerID="cri-o://9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5" gracePeriod=2 Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.694763 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.746363 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-utilities\") pod \"40e41281-dfb2-4126-806d-92671b997dcc\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.746414 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-catalog-content\") pod \"40e41281-dfb2-4126-806d-92671b997dcc\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.746621 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlqbk\" (UniqueName: 
\"kubernetes.io/projected/40e41281-dfb2-4126-806d-92671b997dcc-kube-api-access-wlqbk\") pod \"40e41281-dfb2-4126-806d-92671b997dcc\" (UID: \"40e41281-dfb2-4126-806d-92671b997dcc\") " Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.747513 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-utilities" (OuterVolumeSpecName: "utilities") pod "40e41281-dfb2-4126-806d-92671b997dcc" (UID: "40e41281-dfb2-4126-806d-92671b997dcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.747878 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.758684 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40e41281-dfb2-4126-806d-92671b997dcc" (UID: "40e41281-dfb2-4126-806d-92671b997dcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.760597 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e41281-dfb2-4126-806d-92671b997dcc-kube-api-access-wlqbk" (OuterVolumeSpecName: "kube-api-access-wlqbk") pod "40e41281-dfb2-4126-806d-92671b997dcc" (UID: "40e41281-dfb2-4126-806d-92671b997dcc"). InnerVolumeSpecName "kube-api-access-wlqbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.849412 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e41281-dfb2-4126-806d-92671b997dcc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:48 crc kubenswrapper[4774]: I1001 14:01:48.849513 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlqbk\" (UniqueName: \"kubernetes.io/projected/40e41281-dfb2-4126-806d-92671b997dcc-kube-api-access-wlqbk\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.212165 4774 generic.go:334] "Generic (PLEG): container finished" podID="40e41281-dfb2-4126-806d-92671b997dcc" containerID="9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5" exitCode=0 Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.212169 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tl84h" Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.212219 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl84h" event={"ID":"40e41281-dfb2-4126-806d-92671b997dcc","Type":"ContainerDied","Data":"9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5"} Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.212314 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tl84h" event={"ID":"40e41281-dfb2-4126-806d-92671b997dcc","Type":"ContainerDied","Data":"1dab3930bfbba12e9a95dd76fe8eb84a6ae0aaa10804163a1eecbf3101a91259"} Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.212352 4774 scope.go:117] "RemoveContainer" containerID="9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5" Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.241528 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tl84h"]
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.243403 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tl84h"]
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.253419 4774 scope.go:117] "RemoveContainer" containerID="f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.287043 4774 scope.go:117] "RemoveContainer" containerID="8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.348641 4774 scope.go:117] "RemoveContainer" containerID="9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5"
Oct 01 14:01:49 crc kubenswrapper[4774]: E1001 14:01:49.349405 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5\": container with ID starting with 9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5 not found: ID does not exist" containerID="9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.349434 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5"} err="failed to get container status \"9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5\": rpc error: code = NotFound desc = could not find container \"9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5\": container with ID starting with 9c569505ceafb8b6266a1707a3164e16fd33653be9942cb68f20bbb764df63a5 not found: ID does not exist"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.349469 4774 scope.go:117] "RemoveContainer" containerID="f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a"
Oct 01 14:01:49 crc kubenswrapper[4774]: E1001 14:01:49.350184 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a\": container with ID starting with f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a not found: ID does not exist" containerID="f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.350205 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a"} err="failed to get container status \"f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a\": rpc error: code = NotFound desc = could not find container \"f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a\": container with ID starting with f1ba3ee7974e19e6806636788c4bd860103edda423c4058fd873df0fbcc0ea9a not found: ID does not exist"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.350220 4774 scope.go:117] "RemoveContainer" containerID="8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a"
Oct 01 14:01:49 crc kubenswrapper[4774]: E1001 14:01:49.350631 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a\": container with ID starting with 8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a not found: ID does not exist" containerID="8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a"
Oct 01 14:01:49 crc kubenswrapper[4774]: I1001 14:01:49.350656 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a"} err="failed to get container status \"8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a\": rpc error: code = NotFound desc = could not find container \"8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a\": container with ID starting with 8a4d2ada0555ded136cd4388edd8a258185b5840a1291cfe92d70277e7aa6c2a not found: ID does not exist"
Oct 01 14:01:50 crc kubenswrapper[4774]: I1001 14:01:50.885926 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e41281-dfb2-4126-806d-92671b997dcc" path="/var/lib/kubelet/pods/40e41281-dfb2-4126-806d-92671b997dcc/volumes"
Oct 01 14:02:07 crc kubenswrapper[4774]: I1001 14:02:07.270791 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:02:07 crc kubenswrapper[4774]: I1001 14:02:07.271423 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:02:07 crc kubenswrapper[4774]: I1001 14:02:07.271501 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd"
Oct 01 14:02:07 crc kubenswrapper[4774]: I1001 14:02:07.272044 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:02:07 crc kubenswrapper[4774]: I1001 14:02:07.272128 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" gracePeriod=600
Oct 01 14:02:07 crc kubenswrapper[4774]: E1001 14:02:07.408168 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:02:08 crc kubenswrapper[4774]: I1001 14:02:08.384957 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" exitCode=0
Oct 01 14:02:08 crc kubenswrapper[4774]: I1001 14:02:08.385036 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"}
Oct 01 14:02:08 crc kubenswrapper[4774]: I1001 14:02:08.385121 4774 scope.go:117] "RemoveContainer" containerID="c15aef90c5355ee45eff7a2029dad852c4a145de45cfd5eb39c4d4e24c84668f"
Oct 01 14:02:08 crc kubenswrapper[4774]: I1001 14:02:08.386314 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:02:08 crc kubenswrapper[4774]: E1001 14:02:08.386920 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:02:10 crc kubenswrapper[4774]: I1001 14:02:10.021372 4774 scope.go:117] "RemoveContainer" containerID="0c3dfde683efa4f010ac68fedec926e0d9ed8a48ae29cb0c17337512c3264761"
Oct 01 14:02:10 crc kubenswrapper[4774]: I1001 14:02:10.051002 4774 scope.go:117] "RemoveContainer" containerID="28ee9994d431637ce0e99c802227d39ef9e343e7fe4a0181fc1b3fc31f7af1ae"
Oct 01 14:02:10 crc kubenswrapper[4774]: I1001 14:02:10.110923 4774 scope.go:117] "RemoveContainer" containerID="6a5c9da08026208f38109311cc54b8d327d886f4db61859c0654741821c2bc2c"
Oct 01 14:02:19 crc kubenswrapper[4774]: I1001 14:02:19.476118 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a" exitCode=1
Oct 01 14:02:19 crc kubenswrapper[4774]: I1001 14:02:19.476309 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"}
Oct 01 14:02:19 crc kubenswrapper[4774]: I1001 14:02:19.476880 4774 scope.go:117] "RemoveContainer" containerID="57968a2cd255e91e38cd9fb079aa7c0607a9ec7789d35ff1c42de69285006e56"
Oct 01 14:02:19 crc kubenswrapper[4774]: I1001 14:02:19.477754 4774 scope.go:117] "RemoveContainer" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"
Oct 01 14:02:19 crc kubenswrapper[4774]: E1001 14:02:19.478089 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:02:20 crc kubenswrapper[4774]: I1001 14:02:20.870620 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:02:20 crc kubenswrapper[4774]: E1001 14:02:20.871452 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:02:24 crc kubenswrapper[4774]: E1001 14:02:24.924320 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42"
Oct 01 14:02:25 crc kubenswrapper[4774]: I1001 14:02:25.545776 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:02:28 crc kubenswrapper[4774]: I1001 14:02:28.289860 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 14:02:28 crc kubenswrapper[4774]: I1001 14:02:28.291052 4774 scope.go:117] "RemoveContainer" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"
Oct 01 14:02:28 crc kubenswrapper[4774]: E1001 14:02:28.291504 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:02:29 crc kubenswrapper[4774]: I1001 14:02:29.943385 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:02:29 crc kubenswrapper[4774]: I1001 14:02:29.943493 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:02:29 crc kubenswrapper[4774]: E1001 14:02:29.943581 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found
Oct 01 14:02:29 crc kubenswrapper[4774]: E1001 14:02:29.943666 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:04:31.943640072 +0000 UTC m=+1643.833270709 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found
Oct 01 14:02:29 crc kubenswrapper[4774]: E1001 14:02:29.943769 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Oct 01 14:02:29 crc kubenswrapper[4774]: E1001 14:02:29.943872 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:04:31.943846527 +0000 UTC m=+1643.833477144 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found
Oct 01 14:02:35 crc kubenswrapper[4774]: I1001 14:02:35.870316 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:02:35 crc kubenswrapper[4774]: E1001 14:02:35.871273 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:02:38 crc kubenswrapper[4774]: I1001 14:02:38.289574 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 14:02:38 crc kubenswrapper[4774]: I1001 14:02:38.290805 4774 scope.go:117] "RemoveContainer" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"
Oct 01 14:02:38 crc kubenswrapper[4774]: E1001 14:02:38.291135 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:02:50 crc kubenswrapper[4774]: I1001 14:02:50.871570 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:02:50 crc kubenswrapper[4774]: E1001 14:02:50.872876 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:02:51 crc kubenswrapper[4774]: I1001 14:02:51.871149 4774 scope.go:117] "RemoveContainer" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"
Oct 01 14:02:51 crc kubenswrapper[4774]: E1001 14:02:51.871622 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:03:01 crc kubenswrapper[4774]: I1001 14:03:01.871500 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:03:01 crc kubenswrapper[4774]: E1001 14:03:01.874533 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:03:04 crc kubenswrapper[4774]: I1001 14:03:04.870290 4774 scope.go:117] "RemoveContainer" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"
Oct 01 14:03:05 crc kubenswrapper[4774]: I1001 14:03:05.918430 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"}
Oct 01 14:03:05 crc kubenswrapper[4774]: I1001 14:03:05.919181 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 14:03:10 crc kubenswrapper[4774]: I1001 14:03:10.246490 4774 scope.go:117] "RemoveContainer" containerID="f28457dd8a2f8ff9c9a8270b6b65bb2dea446bf883280976db1f4cc4a5dc61a9"
Oct 01 14:03:10 crc kubenswrapper[4774]: I1001 14:03:10.291913 4774 scope.go:117] "RemoveContainer" containerID="388a166cb679ecb5604fb371659818bde230cd3e9580a6d71478ab3f557f73f6"
Oct 01 14:03:10 crc kubenswrapper[4774]: I1001 14:03:10.326949 4774 scope.go:117] "RemoveContainer" containerID="665c6e5e8441c6a74bb84adfe7e1a6f3121c2d0340dd2fb9923c1364c7036fc1"
Oct 01 14:03:10 crc kubenswrapper[4774]: I1001 14:03:10.360399 4774 scope.go:117] "RemoveContainer" containerID="dbfd9d7aa8103cf7ec614ec7c12818d1cad4498e1e67f0dfdf653effdbf4b2b7"
Oct 01 14:03:14 crc kubenswrapper[4774]: I1001 14:03:14.870861 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:03:14 crc kubenswrapper[4774]: E1001 14:03:14.871675 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:03:18 crc kubenswrapper[4774]: I1001 14:03:18.298388 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 14:03:26 crc kubenswrapper[4774]: I1001 14:03:26.871331 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:03:26 crc kubenswrapper[4774]: E1001 14:03:26.872188 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:03:37 crc kubenswrapper[4774]: I1001 14:03:37.870965 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:03:37 crc kubenswrapper[4774]: E1001 14:03:37.872363 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:03:50 crc kubenswrapper[4774]: I1001 14:03:50.870648 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:03:50 crc kubenswrapper[4774]: E1001 14:03:50.871686 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:04:05 crc kubenswrapper[4774]: I1001 14:04:05.870440 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:04:05 crc kubenswrapper[4774]: E1001 14:04:05.871555 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:04:10 crc kubenswrapper[4774]: I1001 14:04:10.487002 4774 scope.go:117] "RemoveContainer" containerID="800992360e4ad4cb2b1652e58cdf8bf2e35adfe118b55abd740c79a2caf7e121"
Oct 01 14:04:10 crc kubenswrapper[4774]: I1001 14:04:10.529646 4774 scope.go:117] "RemoveContainer" containerID="afed800a1b6c3c2802da4784906bd35b48a7e9e9c4346eabe3d908f6762949be"
Oct 01 14:04:20 crc kubenswrapper[4774]: I1001 14:04:20.870887 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:04:20 crc kubenswrapper[4774]: E1001 14:04:20.871934 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:04:28 crc kubenswrapper[4774]: E1001 14:04:28.547538 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42"
Oct 01 14:04:28 crc kubenswrapper[4774]: I1001 14:04:28.674092 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:04:32 crc kubenswrapper[4774]: I1001 14:04:32.026985 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:04:32 crc kubenswrapper[4774]: I1001 14:04:32.027356 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient"
Oct 01 14:04:32 crc kubenswrapper[4774]: E1001 14:04:32.027244 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found
Oct 01 14:04:32 crc kubenswrapper[4774]: E1001 14:04:32.027513 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:06:34.027483722 +0000 UTC m=+1765.917114349 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found
Oct 01 14:04:32 crc kubenswrapper[4774]: E1001 14:04:32.027633 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Oct 01 14:04:32 crc kubenswrapper[4774]: E1001 14:04:32.027717 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:06:34.027698478 +0000 UTC m=+1765.917329115 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found
Oct 01 14:04:34 crc kubenswrapper[4774]: I1001 14:04:34.871311 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:04:34 crc kubenswrapper[4774]: E1001 14:04:34.872021 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:04:46 crc kubenswrapper[4774]: I1001 14:04:46.870636 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:04:46 crc kubenswrapper[4774]: E1001 14:04:46.873155 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:04:59 crc kubenswrapper[4774]: I1001 14:04:59.871742 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:04:59 crc kubenswrapper[4774]: E1001 14:04:59.872632 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:05:10 crc kubenswrapper[4774]: I1001 14:05:10.607612 4774 scope.go:117] "RemoveContainer" containerID="98f1d0e8be02556c92f44319b70c5e887e8674971422459c461b2fa18abcd67a"
Oct 01 14:05:10 crc kubenswrapper[4774]: I1001 14:05:10.640057 4774 scope.go:117] "RemoveContainer" containerID="7ab8ec0a4a3e190d237dd706662b9d147cbfafece8b6790f9c7768ad527bebcd"
Oct 01 14:05:12 crc kubenswrapper[4774]: I1001 14:05:12.871118 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:05:12 crc kubenswrapper[4774]: E1001 14:05:12.872362 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:05:23 crc kubenswrapper[4774]: I1001 14:05:23.152561 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3" exitCode=1
Oct 01 14:05:23 crc kubenswrapper[4774]: I1001 14:05:23.152636 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"}
Oct 01 14:05:23 crc kubenswrapper[4774]: I1001 14:05:23.153515 4774 scope.go:117] "RemoveContainer" containerID="5a5ccee048afda8e92009159fc46f2750af9fe09be477a4e1a06830da99c3a1a"
Oct 01 14:05:23 crc kubenswrapper[4774]: I1001 14:05:23.155025 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"
Oct 01 14:05:23 crc kubenswrapper[4774]: E1001 14:05:23.155615 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:05:26 crc kubenswrapper[4774]: I1001 14:05:26.870913 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:05:26 crc kubenswrapper[4774]: E1001 14:05:26.872153 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:05:28 crc kubenswrapper[4774]: I1001 14:05:28.289318 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 14:05:28 crc kubenswrapper[4774]: I1001 14:05:28.290414 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"
Oct 01 14:05:28 crc kubenswrapper[4774]: E1001 14:05:28.290972 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:05:38 crc kubenswrapper[4774]: I1001 14:05:38.291729 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745"
Oct 01 14:05:38 crc kubenswrapper[4774]: I1001 14:05:38.293125 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"
Oct 01 14:05:38 crc kubenswrapper[4774]: E1001 14:05:38.293408 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:05:40 crc kubenswrapper[4774]: I1001 14:05:40.870588 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:05:40 crc kubenswrapper[4774]: E1001 14:05:40.871390 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:05:49 crc kubenswrapper[4774]: I1001 14:05:49.870635 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"
Oct 01 14:05:49 crc kubenswrapper[4774]: E1001 14:05:49.871624 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:05:52 crc kubenswrapper[4774]: I1001 14:05:52.870990 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:05:52 crc kubenswrapper[4774]: E1001 14:05:52.871772 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:06:01 crc kubenswrapper[4774]: I1001 14:06:01.870734 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3"
Oct 01 14:06:01 crc kubenswrapper[4774]: E1001 14:06:01.871418 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"
Oct 01 14:06:06 crc kubenswrapper[4774]: I1001 14:06:06.871021 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b"
Oct 01 14:06:06 crc kubenswrapper[4774]: E1001 14:06:06.872981 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742"
Oct 01 14:06:10 crc kubenswrapper[4774]: I1001 14:06:10.727331 4774 scope.go:117] "RemoveContainer" containerID="42f6587bee42e963fe8c4a76b48e7d56ffb5525da05823dc61d6053869639214"
Oct 01 14:06:10 crc kubenswrapper[4774]: I1001 14:06:10.771600 4774 scope.go:117] "RemoveContainer" containerID="ad5a25121cb41282a5ee5c003cdb8b6486fa6ef864d9e86a35db1b7e11cd6b87"
Oct 01 14:06:10 crc kubenswrapper[4774]:
I1001 14:06:10.809018 4774 scope.go:117] "RemoveContainer" containerID="bfe56871649e1fe80d595d13c1de8aea42efc1993a9d539fecbe2dcc8165493e" Oct 01 14:06:10 crc kubenswrapper[4774]: I1001 14:06:10.831005 4774 scope.go:117] "RemoveContainer" containerID="0a99b08a95c7f6ad3095ca63d6e1a4ddfe9c2184ce6772538491c2af3cede00d" Oct 01 14:06:16 crc kubenswrapper[4774]: I1001 14:06:16.870034 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3" Oct 01 14:06:16 crc kubenswrapper[4774]: E1001 14:06:16.870721 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:06:20 crc kubenswrapper[4774]: I1001 14:06:20.871307 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" Oct 01 14:06:20 crc kubenswrapper[4774]: E1001 14:06:20.872030 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:06:27 crc kubenswrapper[4774]: I1001 14:06:27.871681 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3" Oct 01 14:06:27 crc kubenswrapper[4774]: E1001 14:06:27.872716 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:06:31 crc kubenswrapper[4774]: E1001 14:06:31.676189 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42" Oct 01 14:06:31 crc kubenswrapper[4774]: I1001 14:06:31.764123 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:06:33 crc kubenswrapper[4774]: I1001 14:06:33.871027 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" Oct 01 14:06:33 crc kubenswrapper[4774]: E1001 14:06:33.871762 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:06:34 crc kubenswrapper[4774]: I1001 14:06:34.032062 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:06:34 crc kubenswrapper[4774]: I1001 14:06:34.032166 4774 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:06:34 crc kubenswrapper[4774]: E1001 14:06:34.032402 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:06:34 crc kubenswrapper[4774]: E1001 14:06:34.032556 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:08:36.032527447 +0000 UTC m=+1887.922158074 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:06:34 crc kubenswrapper[4774]: E1001 14:06:34.032612 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:06:34 crc kubenswrapper[4774]: E1001 14:06:34.032747 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:08:36.03271457 +0000 UTC m=+1887.922345197 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:06:39 crc kubenswrapper[4774]: I1001 14:06:39.875608 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3" Oct 01 14:06:39 crc kubenswrapper[4774]: E1001 14:06:39.877347 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:06:45 crc kubenswrapper[4774]: I1001 14:06:45.870241 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" Oct 01 14:06:45 crc kubenswrapper[4774]: E1001 14:06:45.873192 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:06:54 crc kubenswrapper[4774]: I1001 14:06:54.870585 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3" Oct 01 14:06:55 crc kubenswrapper[4774]: I1001 14:06:55.962589 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" 
event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5"} Oct 01 14:06:55 crc kubenswrapper[4774]: I1001 14:06:55.963437 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:06:57 crc kubenswrapper[4774]: I1001 14:06:57.870556 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" Oct 01 14:06:57 crc kubenswrapper[4774]: E1001 14:06:57.871250 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:07:08 crc kubenswrapper[4774]: I1001 14:07:08.293902 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:07:10 crc kubenswrapper[4774]: I1001 14:07:10.930308 4774 scope.go:117] "RemoveContainer" containerID="877bdc3a3ad8da2625d815d1333f32ae489849e04bffc9cc444df2b6463a72d4" Oct 01 14:07:12 crc kubenswrapper[4774]: I1001 14:07:12.870835 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" Oct 01 14:07:14 crc kubenswrapper[4774]: I1001 14:07:14.129374 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"1fdc151e89abcbe989d77b2d954c973287de8134a90ac76ae4da50547968aca9"} Oct 01 14:08:34 crc kubenswrapper[4774]: E1001 
14:08:34.766253 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42" Oct 01 14:08:34 crc kubenswrapper[4774]: I1001 14:08:34.810250 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:08:36 crc kubenswrapper[4774]: I1001 14:08:36.066942 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:08:36 crc kubenswrapper[4774]: E1001 14:08:36.067176 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:08:36 crc kubenswrapper[4774]: I1001 14:08:36.067343 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:08:36 crc kubenswrapper[4774]: E1001 14:08:36.067445 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:10:38.067415396 +0000 UTC m=+2009.957046033 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:08:36 crc kubenswrapper[4774]: E1001 14:08:36.067480 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:08:36 crc kubenswrapper[4774]: E1001 14:08:36.067654 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:10:38.067630472 +0000 UTC m=+2009.957261169 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:09:12 crc kubenswrapper[4774]: I1001 14:09:12.141190 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" exitCode=1 Oct 01 14:09:12 crc kubenswrapper[4774]: I1001 14:09:12.141277 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5"} Oct 01 14:09:12 crc kubenswrapper[4774]: I1001 14:09:12.142393 4774 scope.go:117] "RemoveContainer" containerID="44d6e62cbb43231ccfe476cac1714d65b5ecf4715669c4807f3ec603d86f2af3" Oct 01 14:09:12 crc kubenswrapper[4774]: I1001 14:09:12.144561 4774 scope.go:117] 
"RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:09:12 crc kubenswrapper[4774]: E1001 14:09:12.145080 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:09:18 crc kubenswrapper[4774]: I1001 14:09:18.289955 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:09:18 crc kubenswrapper[4774]: I1001 14:09:18.290302 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:09:18 crc kubenswrapper[4774]: I1001 14:09:18.290826 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:09:18 crc kubenswrapper[4774]: E1001 14:09:18.291010 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:09:32 crc kubenswrapper[4774]: I1001 14:09:32.870816 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:09:32 crc kubenswrapper[4774]: E1001 14:09:32.872018 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:09:37 crc kubenswrapper[4774]: I1001 14:09:37.270877 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:09:37 crc kubenswrapper[4774]: I1001 14:09:37.271290 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:09:44 crc kubenswrapper[4774]: I1001 14:09:44.870583 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:09:44 crc kubenswrapper[4774]: E1001 14:09:44.871186 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:09:57 crc kubenswrapper[4774]: I1001 14:09:57.870762 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:09:57 crc kubenswrapper[4774]: 
E1001 14:09:57.871683 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:10:07 crc kubenswrapper[4774]: I1001 14:10:07.271140 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:10:07 crc kubenswrapper[4774]: I1001 14:10:07.272659 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:10:10 crc kubenswrapper[4774]: I1001 14:10:10.870525 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:10:10 crc kubenswrapper[4774]: E1001 14:10:10.871432 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:10:21 crc kubenswrapper[4774]: I1001 14:10:21.870109 4774 scope.go:117] "RemoveContainer" 
containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:10:21 crc kubenswrapper[4774]: E1001 14:10:21.870998 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:10:36 crc kubenswrapper[4774]: I1001 14:10:36.870666 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:10:36 crc kubenswrapper[4774]: E1001 14:10:36.871670 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.271440 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.271770 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 
14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.271893 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.272623 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1fdc151e89abcbe989d77b2d954c973287de8134a90ac76ae4da50547968aca9"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.272784 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://1fdc151e89abcbe989d77b2d954c973287de8134a90ac76ae4da50547968aca9" gracePeriod=600 Oct 01 14:10:37 crc kubenswrapper[4774]: E1001 14:10:37.812092 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42" Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.844445 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="1fdc151e89abcbe989d77b2d954c973287de8134a90ac76ae4da50547968aca9" exitCode=0 Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.844575 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.844480 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"1fdc151e89abcbe989d77b2d954c973287de8134a90ac76ae4da50547968aca9"} Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.844753 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerStarted","Data":"ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc"} Oct 01 14:10:37 crc kubenswrapper[4774]: I1001 14:10:37.844811 4774 scope.go:117] "RemoveContainer" containerID="5ca68a7761362392ad5c481292381e7ce676dee5f1d85b784ed1d447206be97b" Oct 01 14:10:38 crc kubenswrapper[4774]: I1001 14:10:38.157214 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:10:38 crc kubenswrapper[4774]: I1001 14:10:38.157608 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:10:38 crc kubenswrapper[4774]: E1001 14:10:38.157372 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:10:38 crc kubenswrapper[4774]: E1001 14:10:38.157736 4774 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:12:40.157702139 +0000 UTC m=+2132.047332736 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:10:38 crc kubenswrapper[4774]: E1001 14:10:38.157781 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:10:38 crc kubenswrapper[4774]: E1001 14:10:38.157849 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:12:40.157828301 +0000 UTC m=+2132.047458938 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:10:47 crc kubenswrapper[4774]: I1001 14:10:47.871379 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:10:47 crc kubenswrapper[4774]: E1001 14:10:47.872535 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:11:02 crc kubenswrapper[4774]: I1001 14:11:02.870138 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:11:02 crc kubenswrapper[4774]: E1001 14:11:02.870913 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.960822 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9wqkc"] Oct 01 14:11:07 crc kubenswrapper[4774]: E1001 14:11:07.961493 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="extract-content" Oct 01 
14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961508 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="extract-content" Oct 01 14:11:07 crc kubenswrapper[4774]: E1001 14:11:07.961526 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="registry-server" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961534 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="registry-server" Oct 01 14:11:07 crc kubenswrapper[4774]: E1001 14:11:07.961550 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="extract-utilities" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961558 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="extract-utilities" Oct 01 14:11:07 crc kubenswrapper[4774]: E1001 14:11:07.961572 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="registry-server" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961581 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="registry-server" Oct 01 14:11:07 crc kubenswrapper[4774]: E1001 14:11:07.961599 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="extract-utilities" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961607 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="extract-utilities" Oct 01 14:11:07 crc kubenswrapper[4774]: E1001 14:11:07.961622 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="extract-content" Oct 01 
14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961630 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="extract-content" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961785 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e41281-dfb2-4126-806d-92671b997dcc" containerName="registry-server" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.961803 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99bd6c6-9930-4b06-92c3-6311d2113a89" containerName="registry-server" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.962958 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:07 crc kubenswrapper[4774]: I1001 14:11:07.972923 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wqkc"] Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.144682 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzzjd\" (UniqueName: \"kubernetes.io/projected/72e590b7-a099-4aba-8ad9-fd38d24c1c56-kube-api-access-xzzjd\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.144733 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-utilities\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.144935 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-catalog-content\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.245723 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-utilities\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.245809 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-catalog-content\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.245874 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzzjd\" (UniqueName: \"kubernetes.io/projected/72e590b7-a099-4aba-8ad9-fd38d24c1c56-kube-api-access-xzzjd\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.246421 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-catalog-content\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.246418 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-utilities\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.270247 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzzjd\" (UniqueName: \"kubernetes.io/projected/72e590b7-a099-4aba-8ad9-fd38d24c1c56-kube-api-access-xzzjd\") pod \"community-operators-9wqkc\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.300081 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:08 crc kubenswrapper[4774]: I1001 14:11:08.812436 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wqkc"] Oct 01 14:11:08 crc kubenswrapper[4774]: W1001 14:11:08.817587 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e590b7_a099_4aba_8ad9_fd38d24c1c56.slice/crio-b30538dbdf168c099b9ec4388d57d47d24f753b53ed702dfac7ea8e0b4add696 WatchSource:0}: Error finding container b30538dbdf168c099b9ec4388d57d47d24f753b53ed702dfac7ea8e0b4add696: Status 404 returned error can't find the container with id b30538dbdf168c099b9ec4388d57d47d24f753b53ed702dfac7ea8e0b4add696 Oct 01 14:11:09 crc kubenswrapper[4774]: I1001 14:11:09.096482 4774 generic.go:334] "Generic (PLEG): container finished" podID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerID="2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72" exitCode=0 Oct 01 14:11:09 crc kubenswrapper[4774]: I1001 14:11:09.096532 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" 
event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerDied","Data":"2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72"} Oct 01 14:11:09 crc kubenswrapper[4774]: I1001 14:11:09.096560 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerStarted","Data":"b30538dbdf168c099b9ec4388d57d47d24f753b53ed702dfac7ea8e0b4add696"} Oct 01 14:11:09 crc kubenswrapper[4774]: I1001 14:11:09.105348 4774 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:11:11 crc kubenswrapper[4774]: I1001 14:11:11.110074 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerStarted","Data":"b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5"} Oct 01 14:11:12 crc kubenswrapper[4774]: I1001 14:11:12.120753 4774 generic.go:334] "Generic (PLEG): container finished" podID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerID="b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5" exitCode=0 Oct 01 14:11:12 crc kubenswrapper[4774]: I1001 14:11:12.120812 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerDied","Data":"b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5"} Oct 01 14:11:13 crc kubenswrapper[4774]: I1001 14:11:13.134520 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerStarted","Data":"480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66"} Oct 01 14:11:13 crc kubenswrapper[4774]: I1001 14:11:13.153885 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-9wqkc" podStartSLOduration=2.585471429 podStartE2EDuration="6.153864993s" podCreationTimestamp="2025-10-01 14:11:07 +0000 UTC" firstStartedPulling="2025-10-01 14:11:09.105082808 +0000 UTC m=+2040.994713405" lastFinishedPulling="2025-10-01 14:11:12.673476362 +0000 UTC m=+2044.563106969" observedRunningTime="2025-10-01 14:11:13.152913228 +0000 UTC m=+2045.042543865" watchObservedRunningTime="2025-10-01 14:11:13.153864993 +0000 UTC m=+2045.043495600" Oct 01 14:11:17 crc kubenswrapper[4774]: I1001 14:11:17.871634 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:11:17 crc kubenswrapper[4774]: E1001 14:11:17.872336 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:11:18 crc kubenswrapper[4774]: I1001 14:11:18.300399 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:18 crc kubenswrapper[4774]: I1001 14:11:18.300794 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:18 crc kubenswrapper[4774]: I1001 14:11:18.373698 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:19 crc kubenswrapper[4774]: I1001 14:11:19.268581 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:19 crc kubenswrapper[4774]: I1001 14:11:19.335693 4774 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wqkc"] Oct 01 14:11:21 crc kubenswrapper[4774]: I1001 14:11:21.205160 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9wqkc" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="registry-server" containerID="cri-o://480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66" gracePeriod=2 Oct 01 14:11:21 crc kubenswrapper[4774]: E1001 14:11:21.862556 4774 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e590b7_a099_4aba_8ad9_fd38d24c1c56.slice/crio-conmon-480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66.scope\": RecentStats: unable to find data in memory cache]" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.204844 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.213529 4774 generic.go:334] "Generic (PLEG): container finished" podID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerID="480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66" exitCode=0 Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.213582 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerDied","Data":"480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66"} Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.213591 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wqkc" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.213622 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wqkc" event={"ID":"72e590b7-a099-4aba-8ad9-fd38d24c1c56","Type":"ContainerDied","Data":"b30538dbdf168c099b9ec4388d57d47d24f753b53ed702dfac7ea8e0b4add696"} Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.213648 4774 scope.go:117] "RemoveContainer" containerID="480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.233103 4774 scope.go:117] "RemoveContainer" containerID="b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.250361 4774 scope.go:117] "RemoveContainer" containerID="2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.270716 4774 scope.go:117] "RemoveContainer" containerID="480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66" Oct 01 14:11:22 crc kubenswrapper[4774]: E1001 14:11:22.271154 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66\": container with ID starting with 480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66 not found: ID does not exist" containerID="480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.271201 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66"} err="failed to get container status \"480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66\": rpc error: code = NotFound desc = could not find container 
\"480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66\": container with ID starting with 480120552e87cc0c5c057817ab106b26406d16246dc946c66ca5846fec8ffb66 not found: ID does not exist" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.271229 4774 scope.go:117] "RemoveContainer" containerID="b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5" Oct 01 14:11:22 crc kubenswrapper[4774]: E1001 14:11:22.271688 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5\": container with ID starting with b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5 not found: ID does not exist" containerID="b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.271736 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5"} err="failed to get container status \"b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5\": rpc error: code = NotFound desc = could not find container \"b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5\": container with ID starting with b1c69e7b6613c794f119e7b6deb3c0847dfede2aab3558f821c8b86f7f89b0e5 not found: ID does not exist" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.271763 4774 scope.go:117] "RemoveContainer" containerID="2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72" Oct 01 14:11:22 crc kubenswrapper[4774]: E1001 14:11:22.272002 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72\": container with ID starting with 2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72 not found: ID does not exist" 
containerID="2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.272029 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72"} err="failed to get container status \"2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72\": rpc error: code = NotFound desc = could not find container \"2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72\": container with ID starting with 2d738d5148b404372dd34d64f5d8ad4e1148412443f3ad755c5175c1820b6c72 not found: ID does not exist" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.384935 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-utilities\") pod \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.385066 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzzjd\" (UniqueName: \"kubernetes.io/projected/72e590b7-a099-4aba-8ad9-fd38d24c1c56-kube-api-access-xzzjd\") pod \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.385097 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-catalog-content\") pod \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\" (UID: \"72e590b7-a099-4aba-8ad9-fd38d24c1c56\") " Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.385913 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-utilities" (OuterVolumeSpecName: "utilities") pod 
"72e590b7-a099-4aba-8ad9-fd38d24c1c56" (UID: "72e590b7-a099-4aba-8ad9-fd38d24c1c56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.393208 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e590b7-a099-4aba-8ad9-fd38d24c1c56-kube-api-access-xzzjd" (OuterVolumeSpecName: "kube-api-access-xzzjd") pod "72e590b7-a099-4aba-8ad9-fd38d24c1c56" (UID: "72e590b7-a099-4aba-8ad9-fd38d24c1c56"). InnerVolumeSpecName "kube-api-access-xzzjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.432535 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72e590b7-a099-4aba-8ad9-fd38d24c1c56" (UID: "72e590b7-a099-4aba-8ad9-fd38d24c1c56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.486874 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.486912 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e590b7-a099-4aba-8ad9-fd38d24c1c56-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.486926 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzzjd\" (UniqueName: \"kubernetes.io/projected/72e590b7-a099-4aba-8ad9-fd38d24c1c56-kube-api-access-xzzjd\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.552547 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wqkc"] Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.557531 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9wqkc"] Oct 01 14:11:22 crc kubenswrapper[4774]: I1001 14:11:22.884498 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" path="/var/lib/kubelet/pods/72e590b7-a099-4aba-8ad9-fd38d24c1c56/volumes" Oct 01 14:11:29 crc kubenswrapper[4774]: I1001 14:11:29.870308 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:11:29 crc kubenswrapper[4774]: E1001 14:11:29.871305 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager 
pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:11:41 crc kubenswrapper[4774]: I1001 14:11:41.871312 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:11:41 crc kubenswrapper[4774]: E1001 14:11:41.872495 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:11:53 crc kubenswrapper[4774]: I1001 14:11:53.870174 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:11:54 crc kubenswrapper[4774]: I1001 14:11:54.483648 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerStarted","Data":"af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65"} Oct 01 14:11:54 crc kubenswrapper[4774]: I1001 14:11:54.484499 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.202377 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79npf"] Oct 01 14:11:55 crc kubenswrapper[4774]: E1001 14:11:55.202882 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" 
containerName="extract-utilities" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.202907 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="extract-utilities" Oct 01 14:11:55 crc kubenswrapper[4774]: E1001 14:11:55.202949 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="registry-server" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.202965 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="registry-server" Oct 01 14:11:55 crc kubenswrapper[4774]: E1001 14:11:55.202982 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="extract-content" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.202994 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="extract-content" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.203230 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e590b7-a099-4aba-8ad9-fd38d24c1c56" containerName="registry-server" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.205025 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.222083 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79npf"] Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.242963 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-utilities\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.243054 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29xs5\" (UniqueName: \"kubernetes.io/projected/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-kube-api-access-29xs5\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.243121 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-catalog-content\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.344328 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-utilities\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.344476 4774 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-29xs5\" (UniqueName: \"kubernetes.io/projected/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-kube-api-access-29xs5\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.344580 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-catalog-content\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.345138 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-catalog-content\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.345156 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-utilities\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.373253 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29xs5\" (UniqueName: \"kubernetes.io/projected/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-kube-api-access-29xs5\") pod \"redhat-marketplace-79npf\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.520901 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:11:55 crc kubenswrapper[4774]: I1001 14:11:55.731571 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79npf"] Oct 01 14:11:55 crc kubenswrapper[4774]: W1001 14:11:55.736340 4774 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda403f9e8_2b9f_4f58_9ff2_e3bc46f939a3.slice/crio-85fc7f0977297b979384d3b9cb96148cf7842fec4f25e8b082a762783413fb9a WatchSource:0}: Error finding container 85fc7f0977297b979384d3b9cb96148cf7842fec4f25e8b082a762783413fb9a: Status 404 returned error can't find the container with id 85fc7f0977297b979384d3b9cb96148cf7842fec4f25e8b082a762783413fb9a Oct 01 14:11:56 crc kubenswrapper[4774]: I1001 14:11:56.500286 4774 generic.go:334] "Generic (PLEG): container finished" podID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerID="58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20" exitCode=0 Oct 01 14:11:56 crc kubenswrapper[4774]: I1001 14:11:56.500377 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79npf" event={"ID":"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3","Type":"ContainerDied","Data":"58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20"} Oct 01 14:11:56 crc kubenswrapper[4774]: I1001 14:11:56.500432 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79npf" event={"ID":"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3","Type":"ContainerStarted","Data":"85fc7f0977297b979384d3b9cb96148cf7842fec4f25e8b082a762783413fb9a"} Oct 01 14:11:57 crc kubenswrapper[4774]: I1001 14:11:57.511170 4774 generic.go:334] "Generic (PLEG): container finished" podID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerID="eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3" exitCode=0 Oct 01 14:11:57 crc kubenswrapper[4774]: I1001 
14:11:57.511282 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79npf" event={"ID":"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3","Type":"ContainerDied","Data":"eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3"} Oct 01 14:11:58 crc kubenswrapper[4774]: I1001 14:11:58.520521 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79npf" event={"ID":"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3","Type":"ContainerStarted","Data":"4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8"} Oct 01 14:11:58 crc kubenswrapper[4774]: I1001 14:11:58.546110 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79npf" podStartSLOduration=2.064437704 podStartE2EDuration="3.546081242s" podCreationTimestamp="2025-10-01 14:11:55 +0000 UTC" firstStartedPulling="2025-10-01 14:11:56.502985457 +0000 UTC m=+2088.392616084" lastFinishedPulling="2025-10-01 14:11:57.984628995 +0000 UTC m=+2089.874259622" observedRunningTime="2025-10-01 14:11:58.5399609 +0000 UTC m=+2090.429591527" watchObservedRunningTime="2025-10-01 14:11:58.546081242 +0000 UTC m=+2090.435711859" Oct 01 14:12:05 crc kubenswrapper[4774]: I1001 14:12:05.521315 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:12:05 crc kubenswrapper[4774]: I1001 14:12:05.521921 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:12:05 crc kubenswrapper[4774]: I1001 14:12:05.592161 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:12:05 crc kubenswrapper[4774]: I1001 14:12:05.663827 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 
14:12:05 crc kubenswrapper[4774]: I1001 14:12:05.836730 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79npf"] Oct 01 14:12:07 crc kubenswrapper[4774]: I1001 14:12:07.585271 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79npf" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="registry-server" containerID="cri-o://4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8" gracePeriod=2 Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.062158 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.167536 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-catalog-content\") pod \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.167582 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29xs5\" (UniqueName: \"kubernetes.io/projected/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-kube-api-access-29xs5\") pod \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.167708 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-utilities\") pod \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\" (UID: \"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3\") " Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.169108 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-utilities" (OuterVolumeSpecName: "utilities") pod "a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" (UID: "a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.176055 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-kube-api-access-29xs5" (OuterVolumeSpecName: "kube-api-access-29xs5") pod "a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" (UID: "a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3"). InnerVolumeSpecName "kube-api-access-29xs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.190217 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" (UID: "a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.269580 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.269640 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.269662 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29xs5\" (UniqueName: \"kubernetes.io/projected/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3-kube-api-access-29xs5\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.298851 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.598334 4774 generic.go:334] "Generic (PLEG): container finished" podID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerID="4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8" exitCode=0 Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.598418 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79npf" event={"ID":"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3","Type":"ContainerDied","Data":"4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8"} Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.598691 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79npf" event={"ID":"a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3","Type":"ContainerDied","Data":"85fc7f0977297b979384d3b9cb96148cf7842fec4f25e8b082a762783413fb9a"} Oct 01 14:12:08 crc 
kubenswrapper[4774]: I1001 14:12:08.598714 4774 scope.go:117] "RemoveContainer" containerID="4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.598446 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79npf" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.624639 4774 scope.go:117] "RemoveContainer" containerID="eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.658628 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79npf"] Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.661657 4774 scope.go:117] "RemoveContainer" containerID="58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.667102 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79npf"] Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.698065 4774 scope.go:117] "RemoveContainer" containerID="4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8" Oct 01 14:12:08 crc kubenswrapper[4774]: E1001 14:12:08.698712 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8\": container with ID starting with 4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8 not found: ID does not exist" containerID="4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.698763 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8"} err="failed to get container status 
\"4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8\": rpc error: code = NotFound desc = could not find container \"4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8\": container with ID starting with 4a22a932228d08cdc7b75c640a31af4a253fecf5ec79879a9788e0a78713fef8 not found: ID does not exist" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.698800 4774 scope.go:117] "RemoveContainer" containerID="eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3" Oct 01 14:12:08 crc kubenswrapper[4774]: E1001 14:12:08.702430 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3\": container with ID starting with eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3 not found: ID does not exist" containerID="eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.702542 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3"} err="failed to get container status \"eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3\": rpc error: code = NotFound desc = could not find container \"eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3\": container with ID starting with eed6a80043db5277790339a42d652185b2c389607a9a82c8837188e2f74decd3 not found: ID does not exist" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.702573 4774 scope.go:117] "RemoveContainer" containerID="58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20" Oct 01 14:12:08 crc kubenswrapper[4774]: E1001 14:12:08.703030 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20\": container with ID starting with 58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20 not found: ID does not exist" containerID="58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.703058 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20"} err="failed to get container status \"58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20\": rpc error: code = NotFound desc = could not find container \"58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20\": container with ID starting with 58cce6ab30382985e890325c91f212fcefa51468bae1cc6724d4d1fd1600bd20 not found: ID does not exist" Oct 01 14:12:08 crc kubenswrapper[4774]: I1001 14:12:08.888036 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" path="/var/lib/kubelet/pods/a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3/volumes" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.824247 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kh9hp"] Oct 01 14:12:25 crc kubenswrapper[4774]: E1001 14:12:25.826117 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="registry-server" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.826150 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="registry-server" Oct 01 14:12:25 crc kubenswrapper[4774]: E1001 14:12:25.826166 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="extract-content" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.826181 4774 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="extract-content" Oct 01 14:12:25 crc kubenswrapper[4774]: E1001 14:12:25.826236 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="extract-utilities" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.826249 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="extract-utilities" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.826516 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="a403f9e8-2b9f-4f58-9ff2-e3bc46f939a3" containerName="registry-server" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.828401 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.841962 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kh9hp"] Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.962473 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-catalog-content\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.962579 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-utilities\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:25 crc kubenswrapper[4774]: I1001 14:12:25.962802 4774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrdw\" (UniqueName: \"kubernetes.io/projected/591ce1a4-d455-4d40-a6af-e87abff8fc2c-kube-api-access-ncrdw\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.064310 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-catalog-content\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.064383 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-utilities\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.064557 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrdw\" (UniqueName: \"kubernetes.io/projected/591ce1a4-d455-4d40-a6af-e87abff8fc2c-kube-api-access-ncrdw\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.065127 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-catalog-content\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.065160 4774 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-utilities\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.094356 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrdw\" (UniqueName: \"kubernetes.io/projected/591ce1a4-d455-4d40-a6af-e87abff8fc2c-kube-api-access-ncrdw\") pod \"certified-operators-kh9hp\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.154252 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.386875 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kh9hp"] Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.748672 4774 generic.go:334] "Generic (PLEG): container finished" podID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerID="4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4" exitCode=0 Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.748725 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerDied","Data":"4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4"} Oct 01 14:12:26 crc kubenswrapper[4774]: I1001 14:12:26.748771 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerStarted","Data":"dbe874cde2a3b077d12128f9d823d03c574aa172c5ba09e9c0c1a54a6ded12e0"} Oct 01 14:12:27 crc 
kubenswrapper[4774]: I1001 14:12:27.761808 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerStarted","Data":"8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8"} Oct 01 14:12:28 crc kubenswrapper[4774]: I1001 14:12:28.773754 4774 generic.go:334] "Generic (PLEG): container finished" podID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerID="8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8" exitCode=0 Oct 01 14:12:28 crc kubenswrapper[4774]: I1001 14:12:28.773956 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerDied","Data":"8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8"} Oct 01 14:12:29 crc kubenswrapper[4774]: I1001 14:12:29.784688 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerStarted","Data":"98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff"} Oct 01 14:12:29 crc kubenswrapper[4774]: I1001 14:12:29.810374 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kh9hp" podStartSLOduration=2.367403439 podStartE2EDuration="4.810349673s" podCreationTimestamp="2025-10-01 14:12:25 +0000 UTC" firstStartedPulling="2025-10-01 14:12:26.751551319 +0000 UTC m=+2118.641181916" lastFinishedPulling="2025-10-01 14:12:29.194497553 +0000 UTC m=+2121.084128150" observedRunningTime="2025-10-01 14:12:29.804995812 +0000 UTC m=+2121.694626419" watchObservedRunningTime="2025-10-01 14:12:29.810349673 +0000 UTC m=+2121.699980320" Oct 01 14:12:36 crc kubenswrapper[4774]: I1001 14:12:36.154988 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:36 crc kubenswrapper[4774]: I1001 14:12:36.156739 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:36 crc kubenswrapper[4774]: I1001 14:12:36.240786 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:36 crc kubenswrapper[4774]: I1001 14:12:36.903337 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:36 crc kubenswrapper[4774]: I1001 14:12:36.974574 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kh9hp"] Oct 01 14:12:37 crc kubenswrapper[4774]: I1001 14:12:37.271180 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:12:37 crc kubenswrapper[4774]: I1001 14:12:37.271266 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:12:38 crc kubenswrapper[4774]: I1001 14:12:38.859603 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kh9hp" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="registry-server" containerID="cri-o://98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff" gracePeriod=2 Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.327065 4774 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.383530 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-catalog-content\") pod \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.383584 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncrdw\" (UniqueName: \"kubernetes.io/projected/591ce1a4-d455-4d40-a6af-e87abff8fc2c-kube-api-access-ncrdw\") pod \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.383641 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-utilities\") pod \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\" (UID: \"591ce1a4-d455-4d40-a6af-e87abff8fc2c\") " Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.384671 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-utilities" (OuterVolumeSpecName: "utilities") pod "591ce1a4-d455-4d40-a6af-e87abff8fc2c" (UID: "591ce1a4-d455-4d40-a6af-e87abff8fc2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.390966 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591ce1a4-d455-4d40-a6af-e87abff8fc2c-kube-api-access-ncrdw" (OuterVolumeSpecName: "kube-api-access-ncrdw") pod "591ce1a4-d455-4d40-a6af-e87abff8fc2c" (UID: "591ce1a4-d455-4d40-a6af-e87abff8fc2c"). 
InnerVolumeSpecName "kube-api-access-ncrdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.445625 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "591ce1a4-d455-4d40-a6af-e87abff8fc2c" (UID: "591ce1a4-d455-4d40-a6af-e87abff8fc2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.485602 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.485657 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncrdw\" (UniqueName: \"kubernetes.io/projected/591ce1a4-d455-4d40-a6af-e87abff8fc2c-kube-api-access-ncrdw\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.485678 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/591ce1a4-d455-4d40-a6af-e87abff8fc2c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.873647 4774 generic.go:334] "Generic (PLEG): container finished" podID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerID="98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff" exitCode=0 Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.873716 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kh9hp" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.873723 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerDied","Data":"98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff"} Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.873956 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kh9hp" event={"ID":"591ce1a4-d455-4d40-a6af-e87abff8fc2c","Type":"ContainerDied","Data":"dbe874cde2a3b077d12128f9d823d03c574aa172c5ba09e9c0c1a54a6ded12e0"} Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.874008 4774 scope.go:117] "RemoveContainer" containerID="98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.912519 4774 scope.go:117] "RemoveContainer" containerID="8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.938195 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kh9hp"] Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.949213 4774 scope.go:117] "RemoveContainer" containerID="4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.950346 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kh9hp"] Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.990867 4774 scope.go:117] "RemoveContainer" containerID="98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff" Oct 01 14:12:39 crc kubenswrapper[4774]: E1001 14:12:39.991434 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff\": container with ID starting with 98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff not found: ID does not exist" containerID="98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.991524 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff"} err="failed to get container status \"98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff\": rpc error: code = NotFound desc = could not find container \"98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff\": container with ID starting with 98b608f0695461a869c34f088ab7ceaf2d3b283547f15b02b952dd15823021ff not found: ID does not exist" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.991565 4774 scope.go:117] "RemoveContainer" containerID="8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8" Oct 01 14:12:39 crc kubenswrapper[4774]: E1001 14:12:39.992134 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8\": container with ID starting with 8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8 not found: ID does not exist" containerID="8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.992174 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8"} err="failed to get container status \"8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8\": rpc error: code = NotFound desc = could not find container \"8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8\": container with ID 
starting with 8df1f915a08091c29d8996f824257efebb225104b45cd3b93c8f85511ba210f8 not found: ID does not exist" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.992202 4774 scope.go:117] "RemoveContainer" containerID="4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4" Oct 01 14:12:39 crc kubenswrapper[4774]: E1001 14:12:39.992704 4774 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4\": container with ID starting with 4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4 not found: ID does not exist" containerID="4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4" Oct 01 14:12:39 crc kubenswrapper[4774]: I1001 14:12:39.992750 4774 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4"} err="failed to get container status \"4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4\": rpc error: code = NotFound desc = could not find container \"4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4\": container with ID starting with 4703604796d1cc4eb36a238622773d1bc7c8890dd30f1e0231a84f306ba92ae4 not found: ID does not exist" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.197962 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") pod \"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.198035 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " pod="keystone-kuttl-tests/openstackclient" Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.198112 4774 configmap.go:193] Couldn't get configMap keystone-kuttl-tests/openstack-config: configmap "openstack-config" not found Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.198180 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:14:42.198162251 +0000 UTC m=+2254.087792858 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : configmap "openstack-config" not found Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.198344 4774 secret.go:188] Couldn't get secret keystone-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.198483 4774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret podName:55d7ca53-90d2-4ad6-a72e-7b4618acbf42 nodeName:}" failed. No retries permitted until 2025-10-01 14:14:42.198427018 +0000 UTC m=+2254.088057645 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret") pod "openstackclient" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42") : secret "openstack-config-secret" not found Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.700586 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ph47s"] Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.700964 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="registry-server" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.700991 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="registry-server" Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.701039 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="extract-content" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.701053 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="extract-content" Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.701075 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="extract-utilities" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.701088 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="extract-utilities" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.701313 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" containerName="registry-server" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.702967 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.733038 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ph47s"] Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.809107 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvkx4\" (UniqueName: \"kubernetes.io/projected/425b89f6-2273-4b69-8a0a-5c5163d37d70-kube-api-access-mvkx4\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.809180 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-utilities\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.809346 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-catalog-content\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: E1001 14:12:40.845909 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.878527 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="591ce1a4-d455-4d40-a6af-e87abff8fc2c" path="/var/lib/kubelet/pods/591ce1a4-d455-4d40-a6af-e87abff8fc2c/volumes" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.880989 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.910901 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-catalog-content\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.911026 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvkx4\" (UniqueName: \"kubernetes.io/projected/425b89f6-2273-4b69-8a0a-5c5163d37d70-kube-api-access-mvkx4\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.911053 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-utilities\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.911759 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-utilities\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.911772 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-catalog-content\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:40 crc kubenswrapper[4774]: I1001 14:12:40.941754 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvkx4\" (UniqueName: \"kubernetes.io/projected/425b89f6-2273-4b69-8a0a-5c5163d37d70-kube-api-access-mvkx4\") pod \"redhat-operators-ph47s\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:41 crc kubenswrapper[4774]: I1001 14:12:41.036089 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:41 crc kubenswrapper[4774]: I1001 14:12:41.271101 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ph47s"] Oct 01 14:12:41 crc kubenswrapper[4774]: I1001 14:12:41.888110 4774 generic.go:334] "Generic (PLEG): container finished" podID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerID="850710bdbf960e6fb34293547cfdc2a86b7c8fd9a747467f2492296c6e8eecf3" exitCode=0 Oct 01 14:12:41 crc kubenswrapper[4774]: I1001 14:12:41.888167 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerDied","Data":"850710bdbf960e6fb34293547cfdc2a86b7c8fd9a747467f2492296c6e8eecf3"} Oct 01 14:12:41 crc kubenswrapper[4774]: I1001 14:12:41.888389 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerStarted","Data":"6b7489981b6eca209aa0680f7b6b2bcd6d55704f6a77dee00a1ef49e1101af93"} Oct 01 14:12:42 crc kubenswrapper[4774]: I1001 14:12:42.904936 4774 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerStarted","Data":"6c77a5a51965d261bc2324c4463d4582d29b55054fbaa0a5c813bebc68ac9af6"} Oct 01 14:12:43 crc kubenswrapper[4774]: I1001 14:12:43.919917 4774 generic.go:334] "Generic (PLEG): container finished" podID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerID="6c77a5a51965d261bc2324c4463d4582d29b55054fbaa0a5c813bebc68ac9af6" exitCode=0 Oct 01 14:12:43 crc kubenswrapper[4774]: I1001 14:12:43.920001 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerDied","Data":"6c77a5a51965d261bc2324c4463d4582d29b55054fbaa0a5c813bebc68ac9af6"} Oct 01 14:12:44 crc kubenswrapper[4774]: I1001 14:12:44.928045 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerStarted","Data":"1737d6886da745d3e3d4e27f3fec012f990025bc5e66bcae4732676f0774837c"} Oct 01 14:12:44 crc kubenswrapper[4774]: I1001 14:12:44.957932 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ph47s" podStartSLOduration=2.41247928 podStartE2EDuration="4.957917294s" podCreationTimestamp="2025-10-01 14:12:40 +0000 UTC" firstStartedPulling="2025-10-01 14:12:41.889571868 +0000 UTC m=+2133.779202465" lastFinishedPulling="2025-10-01 14:12:44.435009872 +0000 UTC m=+2136.324640479" observedRunningTime="2025-10-01 14:12:44.955386353 +0000 UTC m=+2136.845016950" watchObservedRunningTime="2025-10-01 14:12:44.957917294 +0000 UTC m=+2136.847547891" Oct 01 14:12:51 crc kubenswrapper[4774]: I1001 14:12:51.037584 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:51 crc kubenswrapper[4774]: I1001 14:12:51.038365 4774 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:51 crc kubenswrapper[4774]: I1001 14:12:51.113432 4774 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:52 crc kubenswrapper[4774]: I1001 14:12:52.072624 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:52 crc kubenswrapper[4774]: I1001 14:12:52.138674 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ph47s"] Oct 01 14:12:54 crc kubenswrapper[4774]: I1001 14:12:54.015765 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ph47s" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="registry-server" containerID="cri-o://1737d6886da745d3e3d4e27f3fec012f990025bc5e66bcae4732676f0774837c" gracePeriod=2 Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.027518 4774 generic.go:334] "Generic (PLEG): container finished" podID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerID="1737d6886da745d3e3d4e27f3fec012f990025bc5e66bcae4732676f0774837c" exitCode=0 Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.027558 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerDied","Data":"1737d6886da745d3e3d4e27f3fec012f990025bc5e66bcae4732676f0774837c"} Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.600815 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.661569 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-utilities\") pod \"425b89f6-2273-4b69-8a0a-5c5163d37d70\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.661682 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-catalog-content\") pod \"425b89f6-2273-4b69-8a0a-5c5163d37d70\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.661728 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvkx4\" (UniqueName: \"kubernetes.io/projected/425b89f6-2273-4b69-8a0a-5c5163d37d70-kube-api-access-mvkx4\") pod \"425b89f6-2273-4b69-8a0a-5c5163d37d70\" (UID: \"425b89f6-2273-4b69-8a0a-5c5163d37d70\") " Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.663245 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-utilities" (OuterVolumeSpecName: "utilities") pod "425b89f6-2273-4b69-8a0a-5c5163d37d70" (UID: "425b89f6-2273-4b69-8a0a-5c5163d37d70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.673821 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425b89f6-2273-4b69-8a0a-5c5163d37d70-kube-api-access-mvkx4" (OuterVolumeSpecName: "kube-api-access-mvkx4") pod "425b89f6-2273-4b69-8a0a-5c5163d37d70" (UID: "425b89f6-2273-4b69-8a0a-5c5163d37d70"). InnerVolumeSpecName "kube-api-access-mvkx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.756076 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "425b89f6-2273-4b69-8a0a-5c5163d37d70" (UID: "425b89f6-2273-4b69-8a0a-5c5163d37d70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.763291 4774 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.763325 4774 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/425b89f6-2273-4b69-8a0a-5c5163d37d70-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:55 crc kubenswrapper[4774]: I1001 14:12:55.763336 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvkx4\" (UniqueName: \"kubernetes.io/projected/425b89f6-2273-4b69-8a0a-5c5163d37d70-kube-api-access-mvkx4\") on node \"crc\" DevicePath \"\"" Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.040785 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph47s" event={"ID":"425b89f6-2273-4b69-8a0a-5c5163d37d70","Type":"ContainerDied","Data":"6b7489981b6eca209aa0680f7b6b2bcd6d55704f6a77dee00a1ef49e1101af93"} Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.040884 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph47s" Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.041765 4774 scope.go:117] "RemoveContainer" containerID="1737d6886da745d3e3d4e27f3fec012f990025bc5e66bcae4732676f0774837c" Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.073041 4774 scope.go:117] "RemoveContainer" containerID="6c77a5a51965d261bc2324c4463d4582d29b55054fbaa0a5c813bebc68ac9af6" Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.098190 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ph47s"] Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.105525 4774 scope.go:117] "RemoveContainer" containerID="850710bdbf960e6fb34293547cfdc2a86b7c8fd9a747467f2492296c6e8eecf3" Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.107067 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ph47s"] Oct 01 14:12:56 crc kubenswrapper[4774]: I1001 14:12:56.884088 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" path="/var/lib/kubelet/pods/425b89f6-2273-4b69-8a0a-5c5163d37d70/volumes" Oct 01 14:13:07 crc kubenswrapper[4774]: I1001 14:13:07.271525 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:13:07 crc kubenswrapper[4774]: I1001 14:13:07.272189 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:13:37 crc kubenswrapper[4774]: I1001 
14:13:37.271173 4774 patch_prober.go:28] interesting pod/machine-config-daemon-74ttd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:13:37 crc kubenswrapper[4774]: I1001 14:13:37.272127 4774 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:13:37 crc kubenswrapper[4774]: I1001 14:13:37.272212 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" Oct 01 14:13:37 crc kubenswrapper[4774]: I1001 14:13:37.273281 4774 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc"} pod="openshift-machine-config-operator/machine-config-daemon-74ttd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:13:37 crc kubenswrapper[4774]: I1001 14:13:37.273398 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" containerName="machine-config-daemon" containerID="cri-o://ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" gracePeriod=600 Oct 01 14:13:37 crc kubenswrapper[4774]: E1001 14:13:37.404570 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:13:38 crc kubenswrapper[4774]: I1001 14:13:38.407342 4774 generic.go:334] "Generic (PLEG): container finished" podID="18618ab0-7244-42b3-9ccd-60661c89c742" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" exitCode=0 Oct 01 14:13:38 crc kubenswrapper[4774]: I1001 14:13:38.407379 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" event={"ID":"18618ab0-7244-42b3-9ccd-60661c89c742","Type":"ContainerDied","Data":"ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc"} Oct 01 14:13:38 crc kubenswrapper[4774]: I1001 14:13:38.407409 4774 scope.go:117] "RemoveContainer" containerID="1fdc151e89abcbe989d77b2d954c973287de8134a90ac76ae4da50547968aca9" Oct 01 14:13:38 crc kubenswrapper[4774]: I1001 14:13:38.408010 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:13:38 crc kubenswrapper[4774]: E1001 14:13:38.408289 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:13:50 crc kubenswrapper[4774]: I1001 14:13:50.674345 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 01 14:13:50 crc kubenswrapper[4774]: E1001 14:13:50.675562 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="keystone-kuttl-tests/openstackclient" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42" Oct 01 14:13:50 crc kubenswrapper[4774]: I1001 14:13:50.872354 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:13:50 crc kubenswrapper[4774]: E1001 14:13:50.872946 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:13:51 crc kubenswrapper[4774]: I1001 14:13:51.507343 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:13:51 crc kubenswrapper[4774]: I1001 14:13:51.519172 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:13:51 crc kubenswrapper[4774]: I1001 14:13:51.710019 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgzvf\" (UniqueName: \"kubernetes.io/projected/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-kube-api-access-cgzvf\") pod \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\" (UID: \"55d7ca53-90d2-4ad6-a72e-7b4618acbf42\") " Oct 01 14:13:51 crc kubenswrapper[4774]: I1001 14:13:51.723339 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-kube-api-access-cgzvf" (OuterVolumeSpecName: "kube-api-access-cgzvf") pod "55d7ca53-90d2-4ad6-a72e-7b4618acbf42" (UID: "55d7ca53-90d2-4ad6-a72e-7b4618acbf42"). 
InnerVolumeSpecName "kube-api-access-cgzvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:13:51 crc kubenswrapper[4774]: I1001 14:13:51.812187 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgzvf\" (UniqueName: \"kubernetes.io/projected/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-kube-api-access-cgzvf\") on node \"crc\" DevicePath \"\"" Oct 01 14:13:52 crc kubenswrapper[4774]: I1001 14:13:52.514403 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keystone-kuttl-tests/openstackclient" Oct 01 14:13:52 crc kubenswrapper[4774]: I1001 14:13:52.561811 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 01 14:13:52 crc kubenswrapper[4774]: I1001 14:13:52.566662 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["keystone-kuttl-tests/openstackclient"] Oct 01 14:13:52 crc kubenswrapper[4774]: I1001 14:13:52.625378 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:13:52 crc kubenswrapper[4774]: I1001 14:13:52.625434 4774 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/55d7ca53-90d2-4ad6-a72e-7b4618acbf42-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:13:52 crc kubenswrapper[4774]: I1001 14:13:52.886647 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d7ca53-90d2-4ad6-a72e-7b4618acbf42" path="/var/lib/kubelet/pods/55d7ca53-90d2-4ad6-a72e-7b4618acbf42/volumes" Oct 01 14:14:04 crc kubenswrapper[4774]: I1001 14:14:04.870677 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:14:04 crc kubenswrapper[4774]: E1001 14:14:04.871610 4774 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.247240 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kpntp/must-gather-jjm5w"] Oct 01 14:14:07 crc kubenswrapper[4774]: E1001 14:14:07.248143 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="registry-server" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.248163 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="registry-server" Oct 01 14:14:07 crc kubenswrapper[4774]: E1001 14:14:07.248195 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="extract-utilities" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.248203 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="extract-utilities" Oct 01 14:14:07 crc kubenswrapper[4774]: E1001 14:14:07.248217 4774 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="extract-content" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.248225 4774 state_mem.go:107] "Deleted CPUSet assignment" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="extract-content" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.248418 4774 memory_manager.go:354] "RemoveStaleState removing state" podUID="425b89f6-2273-4b69-8a0a-5c5163d37d70" containerName="registry-server" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 
14:14:07.249301 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.254106 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kpntp"/"default-dockercfg-m4m66" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.260220 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kpntp/must-gather-jjm5w"] Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.262252 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kpntp"/"kube-root-ca.crt" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.264103 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kpntp"/"openshift-service-ca.crt" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.371926 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0e3def3-e253-47b3-a14d-6e3688e79870-must-gather-output\") pod \"must-gather-jjm5w\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.371978 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kthvw\" (UniqueName: \"kubernetes.io/projected/e0e3def3-e253-47b3-a14d-6e3688e79870-kube-api-access-kthvw\") pod \"must-gather-jjm5w\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.472846 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0e3def3-e253-47b3-a14d-6e3688e79870-must-gather-output\") pod \"must-gather-jjm5w\" (UID: 
\"e0e3def3-e253-47b3-a14d-6e3688e79870\") " pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.472914 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kthvw\" (UniqueName: \"kubernetes.io/projected/e0e3def3-e253-47b3-a14d-6e3688e79870-kube-api-access-kthvw\") pod \"must-gather-jjm5w\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.473688 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0e3def3-e253-47b3-a14d-6e3688e79870-must-gather-output\") pod \"must-gather-jjm5w\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.493402 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kthvw\" (UniqueName: \"kubernetes.io/projected/e0e3def3-e253-47b3-a14d-6e3688e79870-kube-api-access-kthvw\") pod \"must-gather-jjm5w\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.571536 4774 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:14:07 crc kubenswrapper[4774]: I1001 14:14:07.778253 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kpntp/must-gather-jjm5w"] Oct 01 14:14:08 crc kubenswrapper[4774]: I1001 14:14:08.634888 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kpntp/must-gather-jjm5w" event={"ID":"e0e3def3-e253-47b3-a14d-6e3688e79870","Type":"ContainerStarted","Data":"1288c6ebd5692dadc55c1cb85c5b2da9e2319cc29d63747664cf2e968efb3ce1"} Oct 01 14:14:11 crc kubenswrapper[4774]: I1001 14:14:11.663839 4774 generic.go:334] "Generic (PLEG): container finished" podID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" exitCode=1 Oct 01 14:14:11 crc kubenswrapper[4774]: I1001 14:14:11.663907 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" event={"ID":"fe84b77c-3e6a-4244-8ef5-c6747459fabc","Type":"ContainerDied","Data":"af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65"} Oct 01 14:14:11 crc kubenswrapper[4774]: I1001 14:14:11.663954 4774 scope.go:117] "RemoveContainer" containerID="d467704c9da1132049bfdcec3b4a7cdca25586c10cb5bf4678cd32583cb89fe5" Oct 01 14:14:11 crc kubenswrapper[4774]: I1001 14:14:11.664887 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:14:11 crc kubenswrapper[4774]: E1001 14:14:11.665250 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" 
podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:14:13 crc kubenswrapper[4774]: I1001 14:14:13.698531 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kpntp/must-gather-jjm5w" event={"ID":"e0e3def3-e253-47b3-a14d-6e3688e79870","Type":"ContainerStarted","Data":"1daa5ab06e13634a92e6fb39ca41d1d15e390319543cd2da43dbf213475ea00a"} Oct 01 14:14:13 crc kubenswrapper[4774]: I1001 14:14:13.698942 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kpntp/must-gather-jjm5w" event={"ID":"e0e3def3-e253-47b3-a14d-6e3688e79870","Type":"ContainerStarted","Data":"683d34fa33c9a007701d8a86c5101b56ea014ac11e6f1b745cefe0f376d64d3b"} Oct 01 14:14:13 crc kubenswrapper[4774]: I1001 14:14:13.718744 4774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kpntp/must-gather-jjm5w" podStartSLOduration=2.034288964 podStartE2EDuration="6.718729122s" podCreationTimestamp="2025-10-01 14:14:07 +0000 UTC" firstStartedPulling="2025-10-01 14:14:07.786857398 +0000 UTC m=+2219.676487995" lastFinishedPulling="2025-10-01 14:14:12.471297516 +0000 UTC m=+2224.360928153" observedRunningTime="2025-10-01 14:14:13.715016595 +0000 UTC m=+2225.604647192" watchObservedRunningTime="2025-10-01 14:14:13.718729122 +0000 UTC m=+2225.608359709" Oct 01 14:14:15 crc kubenswrapper[4774]: I1001 14:14:15.870617 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:14:15 crc kubenswrapper[4774]: E1001 14:14:15.871113 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 
01 14:14:18 crc kubenswrapper[4774]: I1001 14:14:18.289446 4774 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:14:18 crc kubenswrapper[4774]: I1001 14:14:18.289762 4774 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" Oct 01 14:14:18 crc kubenswrapper[4774]: I1001 14:14:18.290356 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:14:18 crc kubenswrapper[4774]: E1001 14:14:18.290613 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:14:27 crc kubenswrapper[4774]: I1001 14:14:27.870173 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:14:27 crc kubenswrapper[4774]: E1001 14:14:27.871160 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:14:29 crc kubenswrapper[4774]: I1001 14:14:29.870379 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:14:29 crc kubenswrapper[4774]: E1001 
14:14:29.870755 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:14:42 crc kubenswrapper[4774]: I1001 14:14:42.870690 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:14:42 crc kubenswrapper[4774]: E1001 14:14:42.871353 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:14:43 crc kubenswrapper[4774]: I1001 14:14:43.871008 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:14:43 crc kubenswrapper[4774]: E1001 14:14:43.871541 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.506902 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/util/0.log" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.655064 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/util/0.log" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.682101 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/pull/0.log" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.688881 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/pull/0.log" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.848382 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/util/0.log" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.866737 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/extract/0.log" Oct 01 14:14:46 crc kubenswrapper[4774]: I1001 14:14:46.871416 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fe17e290923827dd9fb37b3e81441463184703be1419d7456093e6c7expsb5_6a01fafe-bffc-4df2-93bc-43dbc2c424ff/pull/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.018543 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/util/0.log" Oct 01 14:14:47 crc 
kubenswrapper[4774]: I1001 14:14:47.232880 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/pull/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.291118 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/util/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.292781 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/pull/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.421285 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/util/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.433498 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/pull/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.440863 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5909v9jg_64e22a11-410d-4091-bee5-f6d2ab9baa83/extract/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.589459 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/util/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.825230 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/pull/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.825915 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/pull/0.log" Oct 01 14:14:47 crc kubenswrapper[4774]: I1001 14:14:47.840594 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/util/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.016280 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/extract/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.022931 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/pull/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.025978 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_e7e08b4ef4c4210da6849e65893e25f02a2f1e5ad24c8e4d88ab10670f4zdfq_a72155a5-47d5-48da-9b7c-e5b36d579a9d/util/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.171505 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/util/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.354957 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/pull/0.log" Oct 01 14:14:48 crc 
kubenswrapper[4774]: I1001 14:14:48.355406 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/pull/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.388424 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/util/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.539126 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/extract/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.543426 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/pull/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.556634 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fe5824c023aabbb502546c854372a2e601abaa9e5f2db222166b4342b0ctqk7_0ce70e53-d2d6-45ca-a7f8-ff0c5803c2bf/util/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.716385 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6dc4785855-lh6lt_c2f00bec-2d63-49db-94dc-82a40bd0857c/kube-rbac-proxy/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.759549 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6dc4785855-lh6lt_c2f00bec-2d63-49db-94dc-82a40bd0857c/manager/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.790558 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-index-7cfsk_6d988a79-de91-4635-b12e-5bd8b0705e36/registry-server/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.908004 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d9d9bb4b5-fr745_fe84b77c-3e6a-4244-8ef5-c6747459fabc/kube-rbac-proxy/0.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.925316 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d9d9bb4b5-fr745_fe84b77c-3e6a-4244-8ef5-c6747459fabc/manager/6.log" Oct 01 14:14:48 crc kubenswrapper[4774]: I1001 14:14:48.968962 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7d9d9bb4b5-fr745_fe84b77c-3e6a-4244-8ef5-c6747459fabc/manager/6.log" Oct 01 14:14:49 crc kubenswrapper[4774]: I1001 14:14:49.086316 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-z58lt_bcac7839-f573-479d-8139-21163dd1fd20/registry-server/0.log" Oct 01 14:14:49 crc kubenswrapper[4774]: I1001 14:14:49.139667 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-566896bb75-2m2bz_479f4868-5316-4fbf-bb7e-dd89de941340/kube-rbac-proxy/0.log" Oct 01 14:14:49 crc kubenswrapper[4774]: I1001 14:14:49.258108 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-566896bb75-2m2bz_479f4868-5316-4fbf-bb7e-dd89de941340/manager/0.log" Oct 01 14:14:49 crc kubenswrapper[4774]: I1001 14:14:49.281297 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-2wbpw_e2809b6a-b3bf-475f-8d9c-1f8609109e17/registry-server/0.log" Oct 01 14:14:49 crc kubenswrapper[4774]: I1001 14:14:49.429816 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-d2hdg_0a1bac53-f36d-4a76-a0c6-b19a17eb25f4/operator/0.log" Oct 01 14:14:49 crc kubenswrapper[4774]: I1001 14:14:49.477340 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-l72d5_1fc5545f-e6d0-4cd1-9abf-44138f6dc054/registry-server/0.log" Oct 01 14:14:57 crc kubenswrapper[4774]: I1001 14:14:57.871007 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:14:57 crc kubenswrapper[4774]: I1001 14:14:57.871849 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:14:57 crc kubenswrapper[4774]: E1001 14:14:57.872061 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:14:57 crc kubenswrapper[4774]: E1001 14:14:57.872178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.138606 4774 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh"] Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.139335 4774 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.141533 4774 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.143671 4774 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.150724 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh"] Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.223122 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40ed497a-4644-41c1-b041-b5fa8d95b5de-secret-volume\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.223576 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dwr\" (UniqueName: \"kubernetes.io/projected/40ed497a-4644-41c1-b041-b5fa8d95b5de-kube-api-access-g8dwr\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.223665 4774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40ed497a-4644-41c1-b041-b5fa8d95b5de-config-volume\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.324508 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dwr\" (UniqueName: \"kubernetes.io/projected/40ed497a-4644-41c1-b041-b5fa8d95b5de-kube-api-access-g8dwr\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.324583 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40ed497a-4644-41c1-b041-b5fa8d95b5de-config-volume\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.324687 4774 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40ed497a-4644-41c1-b041-b5fa8d95b5de-secret-volume\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.326812 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40ed497a-4644-41c1-b041-b5fa8d95b5de-config-volume\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.331857 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/40ed497a-4644-41c1-b041-b5fa8d95b5de-secret-volume\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.341213 4774 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dwr\" (UniqueName: \"kubernetes.io/projected/40ed497a-4644-41c1-b041-b5fa8d95b5de-kube-api-access-g8dwr\") pod \"collect-profiles-29322135-f7xrh\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.457535 4774 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:00 crc kubenswrapper[4774]: I1001 14:15:00.656397 4774 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh"] Oct 01 14:15:01 crc kubenswrapper[4774]: I1001 14:15:01.025814 4774 generic.go:334] "Generic (PLEG): container finished" podID="40ed497a-4644-41c1-b041-b5fa8d95b5de" containerID="1692ffc32516c68416c44237057a93b494e5499865ec32bd7c016831d82f3167" exitCode=0 Oct 01 14:15:01 crc kubenswrapper[4774]: I1001 14:15:01.025884 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" event={"ID":"40ed497a-4644-41c1-b041-b5fa8d95b5de","Type":"ContainerDied","Data":"1692ffc32516c68416c44237057a93b494e5499865ec32bd7c016831d82f3167"} Oct 01 14:15:01 crc kubenswrapper[4774]: I1001 14:15:01.025942 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" 
event={"ID":"40ed497a-4644-41c1-b041-b5fa8d95b5de","Type":"ContainerStarted","Data":"e3c40a39408855d16556cfd58ffbe7a56d8f28482bedee2126bb66e391e653fe"} Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.274896 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.452165 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8dwr\" (UniqueName: \"kubernetes.io/projected/40ed497a-4644-41c1-b041-b5fa8d95b5de-kube-api-access-g8dwr\") pod \"40ed497a-4644-41c1-b041-b5fa8d95b5de\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.452530 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40ed497a-4644-41c1-b041-b5fa8d95b5de-secret-volume\") pod \"40ed497a-4644-41c1-b041-b5fa8d95b5de\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.452627 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40ed497a-4644-41c1-b041-b5fa8d95b5de-config-volume\") pod \"40ed497a-4644-41c1-b041-b5fa8d95b5de\" (UID: \"40ed497a-4644-41c1-b041-b5fa8d95b5de\") " Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.453576 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ed497a-4644-41c1-b041-b5fa8d95b5de-config-volume" (OuterVolumeSpecName: "config-volume") pod "40ed497a-4644-41c1-b041-b5fa8d95b5de" (UID: "40ed497a-4644-41c1-b041-b5fa8d95b5de"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.461749 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ed497a-4644-41c1-b041-b5fa8d95b5de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40ed497a-4644-41c1-b041-b5fa8d95b5de" (UID: "40ed497a-4644-41c1-b041-b5fa8d95b5de"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.462084 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ed497a-4644-41c1-b041-b5fa8d95b5de-kube-api-access-g8dwr" (OuterVolumeSpecName: "kube-api-access-g8dwr") pod "40ed497a-4644-41c1-b041-b5fa8d95b5de" (UID: "40ed497a-4644-41c1-b041-b5fa8d95b5de"). InnerVolumeSpecName "kube-api-access-g8dwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.554341 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8dwr\" (UniqueName: \"kubernetes.io/projected/40ed497a-4644-41c1-b041-b5fa8d95b5de-kube-api-access-g8dwr\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.554387 4774 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40ed497a-4644-41c1-b041-b5fa8d95b5de-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:02 crc kubenswrapper[4774]: I1001 14:15:02.554400 4774 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40ed497a-4644-41c1-b041-b5fa8d95b5de-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.047001 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" 
event={"ID":"40ed497a-4644-41c1-b041-b5fa8d95b5de","Type":"ContainerDied","Data":"e3c40a39408855d16556cfd58ffbe7a56d8f28482bedee2126bb66e391e653fe"} Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.047038 4774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3c40a39408855d16556cfd58ffbe7a56d8f28482bedee2126bb66e391e653fe" Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.047090 4774 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-f7xrh" Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.127032 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zwfjn_91e70912-55cd-44d4-be6f-b6c637bec430/control-plane-machine-set-operator/0.log" Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.330046 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r7crv_22f1d9a2-5cf9-43f9-bd4e-822382f55a7c/kube-rbac-proxy/0.log" Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.366034 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"] Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.382865 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-dqdht"] Oct 01 14:15:03 crc kubenswrapper[4774]: I1001 14:15:03.436350 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r7crv_22f1d9a2-5cf9-43f9-bd4e-822382f55a7c/machine-api-operator/0.log" Oct 01 14:15:04 crc kubenswrapper[4774]: I1001 14:15:04.880577 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f2d4ce-a586-48bb-b821-71d8a9a988f8" 
path="/var/lib/kubelet/pods/e6f2d4ce-a586-48bb-b821-71d8a9a988f8/volumes" Oct 01 14:15:11 crc kubenswrapper[4774]: I1001 14:15:11.181723 4774 scope.go:117] "RemoveContainer" containerID="3354288fa35a9773464f7399acee45b3a9513b6dd5f49f515624373bba853ab6" Oct 01 14:15:12 crc kubenswrapper[4774]: I1001 14:15:12.870993 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:15:12 crc kubenswrapper[4774]: I1001 14:15:12.871583 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:15:12 crc kubenswrapper[4774]: E1001 14:15:12.871904 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:15:12 crc kubenswrapper[4774]: E1001 14:15:12.871917 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.076754 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-rhh9b_5beb15f9-6d7c-4a0e-b107-4b91e645f9a0/kube-rbac-proxy/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.228815 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-5d688f5ffc-rhh9b_5beb15f9-6d7c-4a0e-b107-4b91e645f9a0/controller/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.352097 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-frr-files/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.544595 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-reloader/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.570831 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-metrics/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.605493 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-reloader/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.607631 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-frr-files/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.732587 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-frr-files/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.733518 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-reloader/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.760235 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-metrics/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.828609 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-metrics/0.log" Oct 01 14:15:19 crc kubenswrapper[4774]: I1001 14:15:19.997641 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-reloader/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.001037 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-metrics/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.007939 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/cp-frr-files/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.025399 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/controller/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.151082 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/frr-metrics/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.166754 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/kube-rbac-proxy/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.222816 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/kube-rbac-proxy-frr/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.354185 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/reloader/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.436217 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-8g948_bf28ca5c-6c07-4447-bfde-7c86bd08f4ae/frr-k8s-webhook-server/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.536934 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-59tlr_e42331ce-7b2f-48d8-81a3-9b8ecf89ec1e/frr/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.599853 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-597c4b7b96-2jb5j_aa824545-2e4e-49f3-ab37-05c10785acee/manager/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.704108 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7b9b85bd76-p6smt_b9d7f73a-7289-4825-9659-d330a9496ae1/webhook-server/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.756584 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2nzm9_1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc/kube-rbac-proxy/0.log" Oct 01 14:15:20 crc kubenswrapper[4774]: I1001 14:15:20.950943 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2nzm9_1ed85e6c-7ddf-4fd6-a2ab-cd71a422b0cc/speaker/0.log" Oct 01 14:15:25 crc kubenswrapper[4774]: I1001 14:15:25.872141 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:15:25 crc kubenswrapper[4774]: E1001 14:15:25.873933 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:15:26 crc kubenswrapper[4774]: I1001 
14:15:26.870619 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:15:26 crc kubenswrapper[4774]: E1001 14:15:26.871411 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:15:32 crc kubenswrapper[4774]: I1001 14:15:32.993973 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_memcached-0_beae5224-51a1-4e93-9381-a10808afc6c1/memcached/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.112754 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-0_1a4a45b8-6786-400a-ad17-d6318d1d3da6/mysql-bootstrap/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.253796 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-0_1a4a45b8-6786-400a-ad17-d6318d1d3da6/mysql-bootstrap/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.260332 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-0_1a4a45b8-6786-400a-ad17-d6318d1d3da6/galera/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.301893 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-1_5660c969-322b-4ef6-a625-091735875ab7/mysql-bootstrap/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.463229 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-1_5660c969-322b-4ef6-a625-091735875ab7/mysql-bootstrap/0.log" Oct 01 14:15:33 crc 
kubenswrapper[4774]: I1001 14:15:33.472638 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-1_5660c969-322b-4ef6-a625-091735875ab7/galera/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.506021 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-2_6b74e8cc-1edb-4f88-89be-672909669498/mysql-bootstrap/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.637800 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-2_6b74e8cc-1edb-4f88-89be-672909669498/mysql-bootstrap/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.688762 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_openstack-galera-2_6b74e8cc-1edb-4f88-89be-672909669498/galera/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.732972 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_rabbitmq-server-0_5be2fd22-c494-44f3-889d-43561b4bfa34/setup-container/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.855500 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_rabbitmq-server-0_5be2fd22-c494-44f3-889d-43561b4bfa34/setup-container/0.log" Oct 01 14:15:33 crc kubenswrapper[4774]: I1001 14:15:33.895872 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/keystone-kuttl-tests_rabbitmq-server-0_5be2fd22-c494-44f3-889d-43561b4bfa34/rabbitmq/0.log" Oct 01 14:15:37 crc kubenswrapper[4774]: I1001 14:15:37.870942 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:15:37 crc kubenswrapper[4774]: E1001 14:15:37.871598 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager 
pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:15:39 crc kubenswrapper[4774]: I1001 14:15:39.871133 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:15:39 crc kubenswrapper[4774]: E1001 14:15:39.873046 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:15:46 crc kubenswrapper[4774]: I1001 14:15:46.510890 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/extract-utilities/0.log" Oct 01 14:15:46 crc kubenswrapper[4774]: I1001 14:15:46.721972 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/extract-content/0.log" Oct 01 14:15:46 crc kubenswrapper[4774]: I1001 14:15:46.768804 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/extract-utilities/0.log" Oct 01 14:15:46 crc kubenswrapper[4774]: I1001 14:15:46.816851 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/extract-content/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.039301 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/extract-content/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.039412 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/extract-utilities/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.407863 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/extract-utilities/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.452235 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-pm5fh_374500ba-b989-44b3-bae2-1df03f16da01/registry-server/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.566010 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/extract-utilities/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.606664 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/extract-content/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.652746 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/extract-content/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.821862 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/extract-content/0.log" Oct 01 14:15:47 crc kubenswrapper[4774]: I1001 14:15:47.834555 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/extract-utilities/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.004589 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/util/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.212428 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/pull/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.216117 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wrzqd_a7665b71-d4e4-4a6a-88dd-a17afc725e54/registry-server/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.244878 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/pull/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.276810 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/util/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.480207 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/pull/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.491559 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/util/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.508529 
4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96k695t_2b9dc4bd-2e62-460b-b85d-f48db06a198f/extract/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.666320 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zndj2_3115994a-f7c2-410d-957b-0b08edee5125/marketplace-operator/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.667584 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/extract-utilities/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.881986 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/extract-content/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.900217 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/extract-utilities/0.log" Oct 01 14:15:48 crc kubenswrapper[4774]: I1001 14:15:48.929524 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/extract-content/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.051921 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/extract-content/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.069179 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/extract-utilities/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.199346 4774 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-c9bgd_1fd81359-4e78-4195-958e-f6a4e859cf2c/registry-server/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.268020 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/extract-utilities/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.358968 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/extract-utilities/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.384992 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/extract-content/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.412874 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/extract-content/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.595570 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/extract-content/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.633496 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/extract-utilities/0.log" Oct 01 14:15:49 crc kubenswrapper[4774]: I1001 14:15:49.998716 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cvpbg_a1970618-5299-4a91-a1c7-d767f8ed21d9/registry-server/0.log" Oct 01 14:15:50 crc kubenswrapper[4774]: I1001 14:15:50.870528 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:15:50 crc kubenswrapper[4774]: E1001 14:15:50.870912 
4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:15:52 crc kubenswrapper[4774]: I1001 14:15:52.870941 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:15:52 crc kubenswrapper[4774]: E1001 14:15:52.871713 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:16:04 crc kubenswrapper[4774]: I1001 14:16:04.870252 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:16:04 crc kubenswrapper[4774]: E1001 14:16:04.872098 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:16:05 crc kubenswrapper[4774]: I1001 14:16:05.870350 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:16:05 crc kubenswrapper[4774]: E1001 
14:16:05.870745 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:16:16 crc kubenswrapper[4774]: I1001 14:16:16.870592 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:16:16 crc kubenswrapper[4774]: E1001 14:16:16.871335 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:16:20 crc kubenswrapper[4774]: I1001 14:16:20.870660 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:16:20 crc kubenswrapper[4774]: E1001 14:16:20.871489 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:16:28 crc kubenswrapper[4774]: I1001 14:16:28.880209 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:16:28 crc 
kubenswrapper[4774]: E1001 14:16:28.881760 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:16:35 crc kubenswrapper[4774]: I1001 14:16:35.870687 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:16:35 crc kubenswrapper[4774]: E1001 14:16:35.871369 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:16:41 crc kubenswrapper[4774]: I1001 14:16:41.870756 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:16:41 crc kubenswrapper[4774]: E1001 14:16:41.871808 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:16:48 crc kubenswrapper[4774]: I1001 14:16:48.877581 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 
14:16:48 crc kubenswrapper[4774]: E1001 14:16:48.880714 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:16:49 crc kubenswrapper[4774]: I1001 14:16:49.821067 4774 generic.go:334] "Generic (PLEG): container finished" podID="e0e3def3-e253-47b3-a14d-6e3688e79870" containerID="683d34fa33c9a007701d8a86c5101b56ea014ac11e6f1b745cefe0f376d64d3b" exitCode=0 Oct 01 14:16:49 crc kubenswrapper[4774]: I1001 14:16:49.821154 4774 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kpntp/must-gather-jjm5w" event={"ID":"e0e3def3-e253-47b3-a14d-6e3688e79870","Type":"ContainerDied","Data":"683d34fa33c9a007701d8a86c5101b56ea014ac11e6f1b745cefe0f376d64d3b"} Oct 01 14:16:49 crc kubenswrapper[4774]: I1001 14:16:49.822029 4774 scope.go:117] "RemoveContainer" containerID="683d34fa33c9a007701d8a86c5101b56ea014ac11e6f1b745cefe0f376d64d3b" Oct 01 14:16:50 crc kubenswrapper[4774]: I1001 14:16:50.390442 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kpntp_must-gather-jjm5w_e0e3def3-e253-47b3-a14d-6e3688e79870/gather/0.log" Oct 01 14:16:52 crc kubenswrapper[4774]: I1001 14:16:52.871227 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:16:52 crc kubenswrapper[4774]: E1001 14:16:52.872005 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.570100 4774 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kpntp/must-gather-jjm5w"] Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.570951 4774 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kpntp/must-gather-jjm5w" podUID="e0e3def3-e253-47b3-a14d-6e3688e79870" containerName="copy" containerID="cri-o://1daa5ab06e13634a92e6fb39ca41d1d15e390319543cd2da43dbf213475ea00a" gracePeriod=2 Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.580200 4774 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kpntp/must-gather-jjm5w"] Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.909507 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kpntp_must-gather-jjm5w_e0e3def3-e253-47b3-a14d-6e3688e79870/copy/0.log" Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.910473 4774 generic.go:334] "Generic (PLEG): container finished" podID="e0e3def3-e253-47b3-a14d-6e3688e79870" containerID="1daa5ab06e13634a92e6fb39ca41d1d15e390319543cd2da43dbf213475ea00a" exitCode=143 Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.966273 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kpntp_must-gather-jjm5w_e0e3def3-e253-47b3-a14d-6e3688e79870/copy/0.log" Oct 01 14:16:57 crc kubenswrapper[4774]: I1001 14:16:57.966559 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.029221 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0e3def3-e253-47b3-a14d-6e3688e79870-must-gather-output\") pod \"e0e3def3-e253-47b3-a14d-6e3688e79870\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.029332 4774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kthvw\" (UniqueName: \"kubernetes.io/projected/e0e3def3-e253-47b3-a14d-6e3688e79870-kube-api-access-kthvw\") pod \"e0e3def3-e253-47b3-a14d-6e3688e79870\" (UID: \"e0e3def3-e253-47b3-a14d-6e3688e79870\") " Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.048040 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e3def3-e253-47b3-a14d-6e3688e79870-kube-api-access-kthvw" (OuterVolumeSpecName: "kube-api-access-kthvw") pod "e0e3def3-e253-47b3-a14d-6e3688e79870" (UID: "e0e3def3-e253-47b3-a14d-6e3688e79870"). InnerVolumeSpecName "kube-api-access-kthvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.100618 4774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e3def3-e253-47b3-a14d-6e3688e79870-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e0e3def3-e253-47b3-a14d-6e3688e79870" (UID: "e0e3def3-e253-47b3-a14d-6e3688e79870"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.130937 4774 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e0e3def3-e253-47b3-a14d-6e3688e79870-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.130977 4774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kthvw\" (UniqueName: \"kubernetes.io/projected/e0e3def3-e253-47b3-a14d-6e3688e79870-kube-api-access-kthvw\") on node \"crc\" DevicePath \"\"" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.883171 4774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e3def3-e253-47b3-a14d-6e3688e79870" path="/var/lib/kubelet/pods/e0e3def3-e253-47b3-a14d-6e3688e79870/volumes" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.918018 4774 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kpntp_must-gather-jjm5w_e0e3def3-e253-47b3-a14d-6e3688e79870/copy/0.log" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.918429 4774 scope.go:117] "RemoveContainer" containerID="1daa5ab06e13634a92e6fb39ca41d1d15e390319543cd2da43dbf213475ea00a" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.918494 4774 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kpntp/must-gather-jjm5w" Oct 01 14:16:58 crc kubenswrapper[4774]: I1001 14:16:58.943906 4774 scope.go:117] "RemoveContainer" containerID="683d34fa33c9a007701d8a86c5101b56ea014ac11e6f1b745cefe0f376d64d3b" Oct 01 14:17:03 crc kubenswrapper[4774]: I1001 14:17:03.871240 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:17:03 crc kubenswrapper[4774]: I1001 14:17:03.871959 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:17:03 crc kubenswrapper[4774]: E1001 14:17:03.872178 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:17:03 crc kubenswrapper[4774]: E1001 14:17:03.872308 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:17:15 crc kubenswrapper[4774]: I1001 14:17:15.870350 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:17:15 crc kubenswrapper[4774]: E1001 14:17:15.871153 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:17:17 crc kubenswrapper[4774]: I1001 14:17:17.870559 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:17:17 crc kubenswrapper[4774]: E1001 14:17:17.871435 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:17:28 crc kubenswrapper[4774]: I1001 14:17:28.876154 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:17:28 crc kubenswrapper[4774]: I1001 14:17:28.877006 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:17:28 crc kubenswrapper[4774]: E1001 14:17:28.877199 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:17:28 crc kubenswrapper[4774]: E1001 14:17:28.877530 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:17:42 crc kubenswrapper[4774]: I1001 14:17:42.870587 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:17:42 crc kubenswrapper[4774]: E1001 14:17:42.871315 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:17:43 crc kubenswrapper[4774]: I1001 14:17:43.870812 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:17:43 crc kubenswrapper[4774]: E1001 14:17:43.871114 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:17:55 crc kubenswrapper[4774]: I1001 14:17:55.871856 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:17:55 crc kubenswrapper[4774]: E1001 14:17:55.872864 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:17:56 crc kubenswrapper[4774]: I1001 14:17:56.874319 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:17:56 crc kubenswrapper[4774]: E1001 14:17:56.875080 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:18:07 crc kubenswrapper[4774]: I1001 14:18:07.873240 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:18:07 crc kubenswrapper[4774]: E1001 14:18:07.874593 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:18:08 crc kubenswrapper[4774]: I1001 14:18:08.888229 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:18:08 crc kubenswrapper[4774]: E1001 14:18:08.888773 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:18:19 crc kubenswrapper[4774]: I1001 14:18:19.870437 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:18:19 crc kubenswrapper[4774]: E1001 14:18:19.879694 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:18:23 crc kubenswrapper[4774]: I1001 14:18:23.870852 4774 scope.go:117] "RemoveContainer" containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:18:23 crc kubenswrapper[4774]: E1001 14:18:23.871395 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc" Oct 01 14:18:34 crc kubenswrapper[4774]: I1001 14:18:34.870861 4774 scope.go:117] "RemoveContainer" containerID="ba99d3abda6fd9ef7f6c2dd606c7e51158fda9be4030ea6da79f720ebe72f4dc" Oct 01 14:18:34 crc kubenswrapper[4774]: I1001 14:18:34.871633 4774 scope.go:117] "RemoveContainer" 
containerID="af3498e6ba1746e5acd5da88370d708d34d4b4c7cba57a36241c070e10fd9c65" Oct 01 14:18:34 crc kubenswrapper[4774]: E1001 14:18:34.871734 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-74ttd_openshift-machine-config-operator(18618ab0-7244-42b3-9ccd-60661c89c742)\"" pod="openshift-machine-config-operator/machine-config-daemon-74ttd" podUID="18618ab0-7244-42b3-9ccd-60661c89c742" Oct 01 14:18:34 crc kubenswrapper[4774]: E1001 14:18:34.872053 4774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=manager pod=keystone-operator-controller-manager-7d9d9bb4b5-fr745_openstack-operators(fe84b77c-3e6a-4244-8ef5-c6747459fabc)\"" pod="openstack-operators/keystone-operator-controller-manager-7d9d9bb4b5-fr745" podUID="fe84b77c-3e6a-4244-8ef5-c6747459fabc"